Tech Threads

Analysis of CCTV footage

CCTV is commonly used for a variety of purposes: maintaining perimeter security in medium- and high-security areas and defense installations, observing the behavior of incarcerated inmates and potentially dangerous patients in medical facilities, traffic monitoring, and so on. A single camera generates roughly 20 gigabytes of compressed digital data per day, and the number of cameras grows with the size of the premises and its surroundings.


If we plan to store that video for a longer duration (for example, six months), the volume for a large deployment can run to hundreds of petabytes, a scale that qualifies as Big Data. If a specific incident occurs during that period, analysis of the CCTV footage is often the only way to establish what actually happened. Hadoop is one of the few technologies or paradigms that can address this effectively within a short span of time using commodity computers.
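To make the scale concrete, here is a minimal back-of-the-envelope sizing sketch in Python. The 20 GB/day per-camera figure comes from the estimate above; the camera counts and the 180-day retention window are illustrative assumptions, not figures from any particular deployment.

```python
# Back-of-the-envelope CCTV storage sizing.
# Assumption: each camera produces ~20 GB/day of compressed video
# (figure from the article); camera counts and retention are illustrative.

def storage_pb(cameras: int, gb_per_day: float, retention_days: int) -> float:
    """Total storage in petabytes for the given fleet and retention window."""
    total_gb = cameras * gb_per_day * retention_days
    return total_gb / 1_000_000  # 1 PB = 1,000,000 GB (decimal units)

# A 1,000-camera deployment retaining six months (~180 days) of footage:
print(storage_pb(1_000, 20, 180))    # → 3.6 PB
# Scaling to 100,000 cameras pushes this into the hundreds of petabytes:
print(storage_pb(100_000, 20, 180))  # → 360.0 PB
```

The point of the sketch is that retention period and fleet size multiply: a single camera is trivial to store, but city-scale deployments with months of retention land squarely in distributed-storage territory, which is where HDFS and commodity clusters come in.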
Written by
Gautam Goswami    

He can be contacted for real-time POC development, hands-on technical training, and the development or support of any Hadoop-related project. Email: gautam@onlineguwahati.com. Gautam is a consultant as well as an educator. Prior to this, he worked as a Sr. Technical Architect across multiple technologies and business domains. He currently specializes in Big Data processing and analysis, data lake creation, and architecture using HDFS. He is also involved in HDFS maintenance, loading multiple types of data from different sources, and designing and developing real-time use cases on client demand to demonstrate how data can be leveraged for business transformation and profitability. He is passionate about sharing knowledge through blogs, seminars, and presentations on various Big Data technologies, methodologies, real-time projects with their architecture and design, procedures for high-volume data ingestion, and basic data lake creation.
