
All You Need to Know About Automic Application Release Automation (ARA)

Posted by Kaif Akhtar

Automic’s Application Release Automation (ARA) provides a consistent, repeatable, and auditable process for automating the packaging, deployment, and updating of applications across multiple environments.

ARA also enables consistency and predictability across environments, which significantly reduces manual effort and human error while increasing deployment speed and application reliability. It also helps increase visibility and control via role-based access, peer review of workflows, native approval support, and automatic change tracking.

Existing MOPs (methods of procedure) are automated through workflows that include an environment health check, deployment, validation, notification, and rollback if required. Deployments are tracked and recorded, dashboards provide statistics on application deployments across target environments, and Data Center teams can have specialized views of execution status across environments.
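To make the shape of such a workflow concrete, here is a minimal Python sketch of the health check, deploy, validate, notify sequence with rollback on failure. It is purely illustrative: the step functions and names are hypothetical placeholders and do not correspond to Automic objects or APIs.

```python
# Illustrative sketch only: models the shape of an ARA-style MOP workflow
# (health check -> deploy -> validate -> notify, with rollback on failure).
# All function names below are hypothetical placeholders, not Automic APIs.

def health_check(env: str) -> None:
    print(f"[{env}] environment health check passed")

def deploy(env: str, package: str) -> None:
    print(f"[{env}] deployed package {package}")

def validate(env: str) -> None:
    print(f"[{env}] post-deployment validation passed")

def notify(env: str, status: str) -> None:
    print(f"[{env}] notification sent: {status}")

def rollback(env: str, package: str) -> None:
    print(f"[{env}] rolled back package {package}")

def run_workflow(env: str, package: str) -> None:
    health_check(env)
    try:
        deploy(env, package)
        validate(env)
        notify(env, "SUCCESS")
    except Exception:
        # Rollback is triggered only if deployment or validation fails.
        rollback(env, package)
        notify(env, "ROLLED BACK")
        raise

if __name__ == "__main__":
    run_workflow("QA", "webshop-1.4.2")
```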

Advantages of Automic’s Application Release Automation

  1. Easy Drill In and Out: With the introduction of the new Automic Web Interface, drilling in and out of workflows at any level is efficient, easy, and refined. It supports easier development, and you can monitor live deployment workflows by drilling down.
  2. Workflow Development: This is one of the most important elements of ARA. The workflow editor is easy to use, and more than one team member can work on the same workflow.
  3. Automatic Rollback: ARA can automatically roll back if an issue crops up during the deployment phase. It is currently the only platform in the market that provides this feature. The run books in Automic that are used to deploy an artifact have corresponding rollback functionality built in, and you can also roll back manually post-deployment.
  4. Server and Application Configurations: Automic can set application properties, and these property values can be accessed by the application during workflow execution (see the sketch after this list).
  5. Workflow Versioning: Workflows are version controlled, so users can easily roll back a workflow execution and drill down into previous executions to verify the logs.
  6. Web-based Editor: ARA comes with a fully web-based workflow editor and monitor and has a simple design that facilitates easy workflow development.
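To picture the property mechanism from point 4, here is a small Python sketch. The dictionary structure, property names, and function are hypothetical assumptions, not Automic’s actual object model; the sketch only shows per-environment property values being resolved when a workflow executes.

```python
# Illustrative only: mimics per-environment application properties that a
# workflow resolves at execution time. Names and structure are hypothetical.

APPLICATION_PROPERTIES = {
    "QA":   {"db_url": "jdbc:postgresql://qa-db:5432/shop",   "max_threads": "8"},
    "PROD": {"db_url": "jdbc:postgresql://prod-db:5432/shop", "max_threads": "32"},
}

def resolve_property(env: str, key: str) -> str:
    """Look up a property value for the target environment at workflow run time."""
    return APPLICATION_PROPERTIES[env][key]

# During workflow execution, the deploy action (or the application itself)
# reads the resolved values instead of hard-coding them per environment.
print(resolve_property("QA", "db_url"))         # jdbc:postgresql://qa-db:5432/shop
print(resolve_property("PROD", "max_threads"))  # 32
```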

Automic’s Web Interface 

1. It is the responsibility of the system administrator to set up the login page of Automic ARA.

2. The system administrator will provide users with login details. ARA prevents users from logging into their account with two different IDs in two tabs of the same browser.

3. After logging in, users see the dashboard interface.

4. Dashboards are a quick way to access objects, tasks and functions. Users can even customize their dashboard if they are authorized to do so by the administrator.

5. The UI of Automic ARA is organized around perspectives. A perspective is a functional area that provides access to the functions a particular user needs; depending on your user roles, you will have access to one or more perspectives.

(i) Administrative perspective is where the admin creates and manages users, user groups, agents, connections, and more.

(ii) Process Assembly perspective is where developers and object designers create and configure objects, define their logic by writing scripts within those objects, and then execute and test them.

(iii) Process Monitoring perspective is where operators and managers keep an eye on processes to make sure the daily workload is processed smoothly, and troubleshoot if something goes wrong.

(iv) Dashboard perspective gives operators and managers quick access to customized views.

System Architecture

Here is what the system architecture of Automic’s ARA looks like:

Functional Architecture – Deployment Process

The deployment process involves the following steps (a minimal sketch follows the list):

  • Create an application along with the various components that are needed, from the Release Automation perspective, to perform its functions.

  • Then create a package to deploy the application to the servers via agents. These agents are simple programs that run in the background on Windows or Linux servers.
  • Design a deployment workflow with multiple actions; actions are combined and linked to one another to perform a single process.
  • Create an operating environment that includes endpoints where the application is deployed.
  • Add deployment targets and assign them to the respective environment. Deployment targets are the servers on which you will deploy the package.
  • Create a login object that will store the login credentials to the servers.
  • Design ARA infrastructure elements for process ownership, variable creation, and logical modeling.
  • Deploy the application by executing the workflow. When executing the workflow, make sure you assign the deployment package and deployment profile. The deployment profile contains the details of the login object and deployment targets.
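As a rough illustration of how these pieces fit together, the Python sketch below models the objects from the steps above as plain data and ties them together in a single execute call. All names and fields are hypothetical assumptions; this is not Automic’s API, only a way to visualize the relationships between application, package, environment, deployment targets, login object, and deployment profile.

```python
# Illustrative sketch of the deployment-process steps as plain data.
# Object names and fields are hypothetical; this is not the Automic API.

application = {
    "name": "webshop",
    "components": ["frontend", "backend", "database"],
}

package = {
    "application": "webshop",
    "version": "1.4.2",
    "artifacts": ["webshop-1.4.2.war"],
}

environment = {
    "name": "QA",
    "deployment_targets": ["qa-app-01", "qa-app-02"],  # servers running an agent
}

login_object = {"name": "qa_login", "user": "deploy_user"}  # credentials for the targets

deployment_profile = {
    "environment": "QA",
    "login": login_object["name"],
    "targets": environment["deployment_targets"],
}

def execute_workflow(workflow: str, pkg: dict, profile: dict) -> None:
    """Run the deployment workflow with the assigned package and profile."""
    print(f"Executing {workflow} for {pkg['application']} {pkg['version']} "
          f"on {', '.join(profile['targets'])}")

execute_workflow("deploy_webshop", package, deployment_profile)
```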

Closing Lines

The all-inclusive platform provided by ARA enables development and operations teams to automate the deployment pipeline from the development stage through to production across environments, which helps in promoting and rejecting versions, rolling back changes automatically whenever necessary, and monitoring environments. All in all, Automic Software Inc. aims with ARA to create an easily orchestrated and centrally managed deployment pipeline. Thanks to its large set of built-in integrations and plugins, Automic’s deployment automation tool suits mid-sized organizations across industry verticals.

That is it from us. Do let us know your thoughts on ARA in the comments below.

Until next time!
