Tech Threads

Pursuit of Artificial Intelligence in Test Automation (ONLINEGUWAHATI – 3.0 Mobile & DT Automation Framework)

At OnlineGuwahati.com, we have always been on the lookout for the best developments in the industry and leveraged them to keep our tools and frameworks efficient, smart, and technologically up to date. As we continued to explore Data Analytics and Artificial Intelligence, we made it our pursuit to change the way our test automation framework works. We therefore decided to collect our test run outcomes, such as logs, results, screenshots, pathways, and exceptions, analyze them, and let our algorithms take smarter decisions with less manual intervention.
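As a rough illustration of what "collecting test run outcomes" can look like, here is a minimal sketch of a per-test record persisted as JSON lines for later analysis. The field names, file paths, and helper function are hypothetical and are not the framework's actual schema.

```python
import json
import time
from dataclasses import dataclass, asdict, field
from typing import List, Optional


# Hypothetical record for one automated test run; field names are illustrative.
@dataclass
class TestRunRecord:
    test_id: str
    status: str                      # "passed" / "failed" / "skipped"
    duration_sec: float
    log_path: str
    screenshot_paths: List[str] = field(default_factory=list)
    exception_type: Optional[str] = None
    executed_at: float = field(default_factory=time.time)


def persist_record(record: TestRunRecord, out_file: str) -> None:
    """Append the run outcome as one JSON line so downstream
    analytics jobs can read the history as a simple dataset."""
    with open(out_file, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")


# Example usage with made-up values.
persist_record(
    TestRunRecord(
        test_id="login_smoke_001",
        status="failed",
        duration_sec=12.4,
        log_path="logs/login_smoke_001.log",
        screenshot_paths=["screens/login_smoke_001_failure.png"],
        exception_type="TimeoutException",
    ),
    out_file="test_run_history.jsonl",
)
```

Keeping the history in a flat, append-only format like this makes it straightforward to feed into whatever analytics or learning pipeline sits behind the decision-making.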


Key Features of the In-The-Dockyard OnlineGuwahati 3.0 Mobile and DT Automation Framework
1. Test predicto-R
2. Suite optimize-R
3. Coverage analyse-R
4. Regression prioritize-R
We are also building next-generation risk-based test prioritization intelligence that continuously samples defect density by functional area, analyzes the types of failures that occur, and captures real user usage behavior to build a prioritization matrix, which in turn becomes the guiding scale for ordering tests.
This reduces delays and the overall time and cost of the test automation effort while providing much better risk management.
It also helps us eliminate duplicate and redundant tests, lessening the overall test case count and achieving the optimal QUALITY vs QUANTITY balance.
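To make the idea of a prioritization matrix concrete, here is a minimal sketch of a weighted risk score built from the three signals mentioned above. The signal names, weights, and example values are assumptions for illustration only, not values taken from the OnlineGuwahati 3.0 framework.

```python
from dataclasses import dataclass
from typing import List


# Illustrative risk signals per functional area (all normalised to 0..1);
# names and weights are assumptions, not the framework's actual matrix.
@dataclass
class AreaRiskSignals:
    area: str
    defect_density: float    # relative density of defects found in this area
    failure_rate: float      # fraction of recent automated runs that failed
    usage_weight: float      # relative share of real user traffic


def risk_score(s: AreaRiskSignals,
               w_defects: float = 0.4,
               w_failures: float = 0.35,
               w_usage: float = 0.25) -> float:
    """Weighted sum of the three signals; higher score means run these tests first."""
    return (w_defects * s.defect_density
            + w_failures * s.failure_rate
            + w_usage * s.usage_weight)


def prioritise(areas: List[AreaRiskSignals]) -> List[AreaRiskSignals]:
    """Order areas from highest to lowest risk using the default weights."""
    return sorted(areas, key=risk_score, reverse=True)


# Example: checkout has the highest combined risk, so its tests run first.
areas = [
    AreaRiskSignals("login",    defect_density=0.2, failure_rate=0.1, usage_weight=0.9),
    AreaRiskSignals("checkout", defect_density=0.7, failure_rate=0.4, usage_weight=0.8),
    AreaRiskSignals("reports",  defect_density=0.5, failure_rate=0.2, usage_weight=0.3),
]
for a in prioritise(areas):
    print(a.area, round(risk_score(a), 3))
```

A simple weighted sum like this is only a starting point; the weights themselves can later be tuned or learned from the collected run history.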

Written by
Gautam Goswami

He can be reached for real-time POC development and hands-on technical training at gautambangalore@gmail.com, as well as for designing, developing, or assisting with any Hadoop/Big Data related task. Gautam is an advisor and an educator. Earlier, he worked as a Sr. Technical Architect across different technologies and business domains in numerous countries.
He is passionate about sharing knowledge through blogs and conducting workshops on various Big Data technologies, frameworks, and related tools.

 

