Data and AI Experts Share Predictions with Databricks: What the Future Holds for AI, Big Data and Analytics

Developing tools to standardize the machine learning process – essentially, making it repeatable regardless of data sets, tools or specific deployment methods – will strongly influence whether and when organizations achieve AI.

AI Gets Leveraged across the Business: According to Stephen Galsworthy, head of data science at Quby, AI has been inspiring in showing what’s possible.

There are numerous examples spanning sectors of how AI can be truly transformative.

However, there are continuing business realities and internal scaling and process challenges.

So, I see the need for a lot of innovation around the less sexy stuff: cost optimization tools, automated accounting, and administration of big data/analytics platforms.

Developing Trust with ‘Explainable AI’: 2018 saw an intensified focus on data bias, trust and transparency in AI – an idea that has implications socially, economically and commercially.

According to Mainak Mazumdar, chief research officer at Nielsen, it is critical to develop AI that is explainable, provable and transparent.

This journey towards trusted systems truly starts with the quality of data used for AI training.

This renewed focus in 2018 on labeled data that can be verified, validated and explained is exciting.

It is exciting that ‘Explainable AI’ can lay the foundation for AI systems that are both generalizable across use cases and trustworthy.
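One common, model-agnostic route to the kind of explainability described above is permutation importance: shuffle a single feature’s values and measure how much the model’s accuracy drops. The sketch below uses a toy model and data set invented purely for illustration (nothing here comes from the article); because the data set is tiny, it averages over every permutation of each column rather than random shuffles.

```python
from itertools import permutations

def permutation_importance(predict, X, y):
    """Score each feature by the average accuracy drop when that
    feature's column is permuted (exhaustively, since X is tiny)."""
    def accuracy(rows):
        return sum(predict(r) == label for r, label in zip(rows, y)) / len(y)
    baseline = accuracy(X)
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        drops = []
        for perm in permutations(col):
            shuffled = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, perm)]
            drops.append(baseline - accuracy(shuffled))
        importances.append(sum(drops) / len(drops))
    return importances

# Toy model: predicts 1 exactly when feature 0 exceeds 0.5,
# so only feature 0 should matter.
predict = lambda row: 1 if row[0] > 0.5 else 0
X = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.7], [0.1, 0.2]]
y = [1, 0, 1, 0]
scores = permutation_importance(predict, X, y)  # → [0.5, 0.0]
```

The output itself is the explanation: feature 0 carries all the predictive signal, feature 1 none, and a stakeholder can verify that claim without inspecting the model’s internals.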

Innovations with Real Time Data: Stephen Harrison, a data scientist for Rue Gilt Groupe, says streaming in and of itself is not brand new.

But Rue Gilt Groupe is planning to leverage streaming data for significant innovations in 2019, like real-time recommendations based on up-to-the-minute data from our order management, click tracking, and other systems.

This is especially important for us because we’re a flash sale retail site, with products and online browsing and purchase behaviors changing by the minute.
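To make the idea concrete, here is a minimal sketch of one way up-to-the-minute click data could drive recommendations: rank products by clicks inside a sliding time window, so a flash sale that is trending right now surfaces immediately. The class, product IDs, and window size are hypothetical, invented for illustration only.

```python
from collections import Counter, deque

class TrendingRecommender:
    """Toy sliding-window recommender: ranks products by how many
    clicks they received in the last `window_seconds`."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()   # (timestamp, product_id), oldest first
        self.counts = Counter()

    def record_click(self, timestamp, product_id):
        self.events.append((timestamp, product_id))
        self.counts[product_id] += 1
        self._expire(timestamp)

    def _expire(self, now):
        # Drop events that have aged out of the window.
        while self.events and self.events[0][0] <= now - self.window:
            _, pid = self.events.popleft()
            self.counts[pid] -= 1
            if self.counts[pid] == 0:
                del self.counts[pid]

    def recommend(self, n=3):
        return [pid for pid, _ in self.counts.most_common(n)]

rec = TrendingRecommender(window_seconds=60)
rec.record_click(0, "dress-123")
rec.record_click(10, "shoes-456")
rec.record_click(20, "shoes-456")
rec.record_click(30, "shoes-456")
rec.record_click(70, "bag-789")     # clicks at t <= 10 now fall out of the window
rec.recommend(2)                    # → ["shoes-456", "bag-789"]
```

A production system would of course feed this from a streaming platform rather than in-process calls, but the core pattern – continuously expiring stale behavior so rankings reflect the current minute – is the same.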

Deep Learning Pays Dividends: According to Kamelia Aryafar, chief algorithm officer at Overstock, deep learning innovations will create many new AI applications, some of which are already in production and making massive changes in the industry.

We’re currently using deep learning on projects ranging from email campaigns with predictive taxonomies to personalization modules that infer user style.

Deep learning will continue to improve core AI and machine learning algorithms.

Solving the world’s toughest data problems starts with bringing all of the data teams together within an organization.

Data science and engineering teams’ ability to innovate faster has historically been hindered by poor data quality, complex machine learning tool environments, and limited talent pools.

Additionally, organizational separation creates friction and slows projects down, becoming an impediment to the highly iterative nature of AI projects.

Much like in 2018, organizations that leverage Unified Analytics will have a competitive advantage. Unified Analytics enables teams to build data pipelines across siloed data storage systems and to prepare labeled data sets for model building, which allows organizations to do AI on their existing data and to iterate on massive data sets.
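As an illustration of the labeled-data-preparation step, preparing training data from siloed sources often amounts to joining behavioral events from one system against outcomes recorded in another. The schemas below are hypothetical and invented for this sketch; real pipelines would run such joins at scale rather than in plain Python.

```python
def build_labeled_dataset(orders, clicks):
    """Join two 'siloed' sources - order records and click events -
    into labeled rows: did a clicked product end up being purchased?"""
    purchased = {(o["user_id"], o["product_id"]) for o in orders}
    return [
        {
            "user_id": c["user_id"],
            "product_id": c["product_id"],
            "label": int((c["user_id"], c["product_id"]) in purchased),
        }
        for c in clicks
    ]

clicks = [
    {"user_id": "u1", "product_id": "p9"},
    {"user_id": "u2", "product_id": "p3"},
]
orders = [{"user_id": "u1", "product_id": "p9"}]
rows = build_labeled_dataset(orders, clicks)
# → [{'user_id': 'u1', 'product_id': 'p9', 'label': 1},
#    {'user_id': 'u2', 'product_id': 'p3', 'label': 0}]
```

Once events and outcomes live behind one unified layer, producing a labeled set like this becomes a routine query rather than a cross-team negotiation – which is precisely the friction the passage above describes.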



