Artificial Intelligence

Big Data and Artificial Intelligence: A perfect match!

Last week, here in Paris, we participated in the Big Data Paris congress, an event dedicated to innovative developments relating to Big Data. This year, the congress decided to also showcase innovations in Artificial Intelligence through its Lab AI. In fact, Artificial Intelligence is a logical extension of intelligent data analysis; after all, the value of data lies in our capacity to analyze it.

Let’s look back at this successful conference!

What is Big Data anyway?

The expression “Big Data” first appeared in 1997, in scientific articles emphasizing the technological challenges of visualizing large volumes of data.

As early as the 40s, issues relating to the massive production of data and its storage were highlighted through what specialists refer to as “the information explosion”: in 1944, Fremont Rider, a librarian at Wesleyan University, estimated that the size of American university libraries was doubling every 16 years! According to him, at that rate, the library at Yale University would contain about 200 million books by 2040, which represents about 10,000 kilometers of bookshelves and over 6,000 people to catalog these resources!

Technological breakthroughs in computer system components (storage media, processors) but also in architecture (the internal architecture of computers as well as the architecture of machine networks) went hand in hand with this massive production of data and the growth of storage capacity. But more than the sheer quantity, it is the difficulty of analyzing the stored volumes of data with classic database management tools that characterizes the Big Data phenomenon.

For more information on the history of Big Data, read Gil Press’s article A Very Short History Of Big Data, which retraces the major milestones of Big Data from the 40s up until today.

AI: an old story renewed by Big Data

During a debate on Artificial Intelligence organized by the French Parliamentary Office for the Evaluation of Scientific and Technological Choices (OPECST), Henri Verdier, French State CTO and Deputy to the Secretary General for Government Modernization, reminded the audience that, although Artificial Intelligence is one of the buzzwords of the moment, it is actually an old story that gets renewed regularly by the convergence of several phenomena: a science dating back to the 50s, mass data, and computing power derived from video games and their graphics cards (GPUs).

  • The “official” birth of Artificial Intelligence dates back to 1956, during a conference at Dartmouth College where John McCarthy and Marvin Lee Minsky, two young mathematicians, proposed a research program based on the principle that the human mind can be decomposed and simulated by a machine. The objective was to reproduce human behaviors in machines.
  • With the development of the digital economy and the emergence of web giants like Google, Facebook, Amazon, etc., the production of data has accelerated greatly in recent years: it is estimated that 90% of all the data produced since the beginning of humanity has been created in the past two years!

Algorithms: the challenge for Big Data

With a market estimated at $187 billion in 2019 (IDC study), Big Data and analytics offer very promising monetization opportunities right now. As previously mentioned, the value of data lies in our capacity to exploit it and link it together, and that is the role of algorithms.

According to Alain Bensoussan, lawyer at the Paris Court of Appeal:

“Data without algorithms is like a violin without its bow”

Nowadays, computer systems can link data very precisely, and this is something that companies have started to understand.
The most common examples can be found in marketing: your actions on social media and search engines are analyzed and exploited to offer you a personalized experience and facilitate the act of buying.
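To make this concrete, here is a minimal sketch of the kind of logic behind such suggestions. It is a deliberately simplified illustration, with made-up session data and a hypothetical recommend function, not the systems these companies actually run: products that are often viewed together get recommended to one another.

    # A minimal sketch of co-occurrence-based recommendation.
    # The session data below is invented purely for illustration.
    from collections import Counter
    from itertools import combinations

    # Hypothetical browsing histories (product IDs per user session).
    sessions = [
        ["laptop", "mouse", "keyboard"],
        ["laptop", "mouse", "monitor"],
        ["phone", "case", "charger"],
        ["laptop", "keyboard", "monitor"],
    ]

    # Count how often each pair of products appears in the same session.
    co_views = Counter()
    for session in sessions:
        for a, b in combinations(sorted(set(session)), 2):
            co_views[(a, b)] += 1

    def recommend(product, top_n=3):
        """Rank other products by how often they were viewed with `product`."""
        scores = Counter()
        for (a, b), count in co_views.items():
            if a == product:
                scores[b] += count
            elif b == product:
                scores[a] += count
        return [item for item, _ in scores.most_common(top_n)]

    print(recommend("laptop"))  # e.g. ['mouse', 'keyboard', 'monitor']

Real systems obviously operate at a very different scale and combine many more signals, but the principle is the same: the value emerges from linking data points together.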

Artificial Intelligence requires data, a lot of data, to achieve pertinent results: it is thanks to large sets of data that machines have been able to beat humans at games like Jeopardy! (IBM Watson, 2011) or Go (AlphaGo, 2016). The objective of Machine Learning and other Artificial Intelligence techniques is to identify recurring patterns, trends, and early signals, using models inspired by the human brain. Even though machines are not yet capable of an “emotional and sensory intelligence”, they already have the upper hand when it comes to analyzing significant volumes of data.
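As an illustration of what a “model inspired by the human brain” can look like, here is a minimal sketch of a single artificial neuron (a perceptron) learning a recurring pattern from toy data. The dataset, weights, and learning rate are all invented for the example and bear no relation to the systems discussed at the congress:

    # A single artificial neuron (perceptron) learning a pattern.
    # Toy dataset: each example is ((feature_1, feature_2), label).
    # The hidden "recurring pattern": label is 1 when the features sum above 1.
    data = [
        ((0.1, 0.2), 0), ((0.9, 0.8), 1),
        ((0.4, 0.3), 0), ((0.7, 0.6), 1),
        ((0.2, 0.5), 0), ((0.8, 0.9), 1),
    ]

    weights = [0.0, 0.0]   # one weight per input feature
    bias = 0.0
    learning_rate = 0.1

    def predict(x):
        """Fire (1) if the weighted sum of the inputs crosses the threshold."""
        activation = weights[0] * x[0] + weights[1] * x[1] + bias
        return 1 if activation > 0 else 0

    # Training: nudge the weights whenever the neuron misclassifies an example.
    for epoch in range(50):
        for x, label in data:
            error = label - predict(x)
            weights[0] += learning_rate * error * x[0]
            weights[1] += learning_rate * error * x[1]
            bias += learning_rate * error

    print(predict((0.95, 0.85)))  # expected: 1, matches the learned pattern
    print(predict((0.15, 0.10)))  # expected: 0

A single neuron like this one can only learn very simple patterns; the point is the mechanism: the more examples it sees, the better its weights capture the regularity hidden in the data, which is exactly why AI is so hungry for data.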

This shows the complementarity between humans and machines: we can stop wasting time and energy on statistical and theoretical analyses, because machines do them better than we do. Thanks to all these technological developments in AI, we can focus on our creativity as well as on our emotional and sensory intelligence. Algorithms are there to optimize our choices.

The quality of data and of their analysis will therefore help find the best way to evaluate each type of situation and its possible solutions, in terms of risks and opportunities. But these developments also raise questions about ethics and the governance of algorithms, which remains one of the main challenges for Big Data if possible abuses are to be kept in check.

Any comments? Feel free to share!