An IoT analytics infrastructure capable of quickly incorporating vast amounts of data from a range of sources in multiple – and frequently incompatible – formats is still a “next-gen” vision.
When machine-to-machine (M2M) platforms open up their data, new applications will leverage intelligence both in objects and the cloud. But evaluating IoT analytics in real time will require an IT infrastructure that can handle big data.
IT infrastructures are still geared toward managing structured information. One significant benefit of merging natural language processing (NLP) with data analysis is the ability to analyze growing volumes of unstructured data accurately, so the use of artificial intelligence (AI), machine learning, and NLP in analytics is becoming increasingly important.
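To make the idea concrete, here is a minimal sketch of turning unstructured device logs into structured, analyzable data. The watch-list terms, log lines, and function names are hypothetical, and real pipelines would use trained NLP models rather than keyword matching:

```python
import re
from collections import Counter

# Hypothetical watch list of terms worth surfacing to an analyst.
ALERT_TERMS = {"overheat", "failure", "vibration"}

def extract_signals(log_lines):
    """Turn free-text device logs into structured (term, count) data."""
    tokens = []
    for line in log_lines:
        tokens.extend(re.findall(r"[a-z]+", line.lower()))
    counts = Counter(tokens)
    alerts = {t: counts[t] for t in ALERT_TERMS if counts[t] > 0}
    return counts, alerts

logs = [
    "Pump 7 reports vibration above threshold",
    "Vibration persists; possible bearing failure",
]
counts, alerts = extract_signals(logs)
print(alerts)  # alert terms that appeared, with frequencies
```

The point of the sketch is the shape of the problem: free text goes in, structured key-value data comes out, and only then can conventional analytics infrastructure work with it.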
Big data applications require a superior network: backhaul for connected objects, open data center interconnections, and a distributed cloud SDN architecture. Such capabilities reduce complexity and optimize the end-to-end network toward the goal of “any device, any app, any network,” which will help M2M service providers reduce investment, facilitate partnerships, and speed time to market.
Higher speed, lower costs
The ever-increasing migration to the cloud demonstrates the pressure on large and small companies alike to reduce data center spend and gain flexibility in plugging into and out of solutions. This holds as true for big data and analytics as it does for traditional transaction processing systems. Last year, Netflix completed its massive seven-year move to the Amazon cloud and shut down the last of the data center capacity used by its streaming service; it now operates a data center only to run its DVD business.
Companies are taking a harder look at assigning economic value to information (“infonomics”) in order to use big data to solve specific problems. They want analytics that yield immediately actionable data, so the pressure will be on IT to deliver results faster. Cloud technologies such as software-defined networking (SDN), network functions virtualization (NFV), and data-center-hosted services will facilitate initial deployment and enable unobstructed growth of IoT applications.
Scalable by design, SDN will help transmit and process the data generated by an explosive number of IoT endpoints without putting the network under further pressure. Service chaining, dynamic load management, and bandwidth-on-demand will make service providers more agile. To reduce operating costs, M2M service providers are looking at infrastructure virtualization. Business modeling by Bell Labs shows that virtualization of the evolved packet core (EPC) may lead to as much as 40% savings in total cost of ownership for M2M services.
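Bell Labs' actual business model is not public, but the mechanism behind a figure like 40% is straightforward: a virtualized EPC trades purpose-built hardware capex for cheaper commodity servers and lower per-year operating costs. The toy model below uses entirely hypothetical cost figures, chosen only to illustrate how such a savings percentage is computed:

```python
def tco(capex, opex_per_year, years=5):
    """Total cost of ownership over a planning horizon."""
    return capex + opex_per_year * years

# Hypothetical figures (illustrative only, not Bell Labs' inputs),
# in arbitrary cost units over a five-year horizon:
dedicated = tco(capex=10.0, opex_per_year=3.0)    # purpose-built EPC hardware
virtualized = tco(capex=4.0, opex_per_year=2.2)   # vEPC on shared commodity servers

savings = 1 - virtualized / dedicated
print(f"TCO savings: {savings:.0%}")
```

Under these assumed inputs the virtualized deployment comes out 40% cheaper; the real savings for any operator depend on its own capex, opex, and horizon.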
As the move to the cloud advances and big data applications evolve, it will be essential for companies to create and share infrastructure that correlates and aggregates vast numbers of data points into analytics. “Moving at the speed of life” will ultimately give users more control over security, multi-modal transportation, retail, e-government, social statistics, environmental measurements and much more.
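The correlate-and-aggregate step can be sketched in a few lines. The sensor names, metrics, and values below are hypothetical stand-ins for feeds from heterogeneous sources such as transit vehicles and air-quality monitors:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw data points: (source, metric, value)
readings = [
    ("bus-12", "speed_kmh", 38.0),
    ("bus-12", "speed_kmh", 42.0),
    ("air-3", "pm25_ugm3", 17.5),
    ("air-3", "pm25_ugm3", 21.1),
]

def aggregate(rows):
    """Group raw data points by (source, metric) and summarize each group."""
    groups = defaultdict(list)
    for source, metric, value in rows:
        groups[(source, metric)].append(value)
    return {key: mean(values) for key, values in groups.items()}

print(aggregate(readings))
```

At IoT scale the same grouping logic runs distributed across a cluster rather than in one process, but the structure of the computation — group by source and metric, then summarize — stays the same.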