Tech Threads

Analysis of CCTV footage

We all know that CCTV is used for a variety of purposes: maintaining perimeter security in medium- to high-security areas and defense installations, observing the behavior of incarcerated inmates and potentially dangerous patients in medical facilities, traffic monitoring, and so on. A single camera generates roughly 20 gigabytes of compressed digital data per day, and the number of cameras grows with the size of the premises and its surroundings.

If we plan to retain that video for a longer duration (for example, six months), the stored data for a large camera network can run into the petabyte range, which places it firmly in Big Data territory. If a specific incident occurs during that period, analyzing the CCTV footage is the only way to establish what actually happened. Hadoop is a technology and paradigm well suited to addressing this effectively within a short span of time, using clusters of commodity computers.
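The sizing above can be sketched with back-of-the-envelope arithmetic. The snippet below uses only the 20 GB/day-per-camera figure from the text; the six-month retention window and the example camera counts are illustrative assumptions.

```python
# Back-of-the-envelope CCTV storage sizing.
# Figures: ~20 GB of compressed footage per camera per day (from the article),
# retained for roughly six months (assumed here as 180 days).

GB_PER_CAMERA_PER_DAY = 20
RETENTION_DAYS = 180

def retained_storage_tb(num_cameras: int) -> float:
    """Total retained footage, in decimal terabytes, for a camera network."""
    total_gb = num_cameras * GB_PER_CAMERA_PER_DAY * RETENTION_DAYS
    return total_gb / 1000

print(retained_storage_tb(1))      # one camera over 6 months -> 3.6 TB
print(retained_storage_tb(1000))   # 1,000 cameras -> 3,600 TB, i.e. ~3.6 PB
```

A single camera is thus manageable, but the totals scale linearly with camera count, which is why large multi-site deployments quickly exceed what a single machine can store or scan, and why a distributed framework like Hadoop becomes attractive.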
Written by
Gautam Goswami    

He can be contacted for real-time POC development, hands-on technical training, and the development or support of Hadoop-related projects. Email: gautam@onlineguwahati.com. Gautam is a consultant as well as an educator. Previously, he worked as a Senior Technical Architect across multiple technologies and business domains. He currently specializes in Big Data processing and analysis, data lake creation, and architecture using HDFS. He is also involved in HDFS maintenance, loading multiple types of data from different sources, and designing and developing real-time use cases on customer demand to demonstrate how data can be leveraged for business transformation and profitability. He is passionate about sharing knowledge through blogs, seminars, and presentations on various Big Data technologies and methodologies, real-time projects and their architecture/design, procedures for high-volume data ingestion, basic data lake creation, and more.
