“Above the Trend Line” – Your Industry Rumor Central for 12/30/2019

There’s no single enterprise that has data at the scale Amazon does to provide the accuracy needed to help employees with their questions.

My belief is that NLP is going to go down the path of AI, where there’s a lot of confusion.”

“In 2020, AI will dramatically improve the employee experience (EX),” commented UJET‘s Anand Janefalkar, Founder and CEO.

“The ability to automatically and instantly collect data from across multiple channels, analyze it and provide actionable insight will enable support agents to more quickly, easily and accurately address customer inquiries and come to highly satisfactory issue resolution.”

“Building better AI: The AI community will continue to make progress in building algorithms that train faster, train with less data, and generalize better,” commented Suraj Amonkar, Fellow, AI @ Scale, Machine Vision and Conversational AI at Fractal Analytics.

“The use of newer algorithms for data augmentation, few-shot/zero-shot learning will make the cumbersome, deep-learning training process easier and developments in feature representations and generative networks will work towards making models more generalizable.
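The data-augmentation idea Amonkar mentions can be illustrated with a toy sketch (the function and data below are invented for illustration; real augmentation libraries apply far richer transforms): each training image is randomly flipped and cropped, so the model sees a slightly different example every epoch.

```python
import random

def augment(image, rng):
    """Return a randomly flipped, randomly cropped copy of an image.

    `image` is a row-major list of pixel rows; this is a toy stand-in
    for the transforms real augmentation pipelines apply.
    """
    out = [list(row) for row in image]
    if rng.random() < 0.5:                 # random horizontal flip
        out = [row[::-1] for row in out]
    h, w = len(out), len(out[0])
    ch, cw = h - 1, w - 1                  # crop one pixel off each dimension
    top = rng.randrange(h - ch + 1)
    left = rng.randrange(w - cw + 1)
    return [row[left:left + cw] for row in out[top:top + ch]]

rng = random.Random(0)
img = [[r * 10 + c for c in range(10)] for r in range(10)]
aug = augment(img, rng)
print(len(aug), len(aug[0]))  # 9 9
```

Because the flip and crop offsets are drawn fresh each call, one source image yields many distinct training examples.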

The use of complex/hybrid series of algorithms to achieve tasks will help build models that scale for complex “real-world” scenarios.

The use of self-supervised methods will hasten the progress of generic models.

The availability of generic “out-of-the-box” models in machine-vision and NLP will continue to evolve fast – but the need for building customized models for real-world challenges would remain.

The use of multi-agent systems will evolve with the need to move toward more generic intelligence capabilities.”

“AI is revolutionizing the buying and selling of complex B2B services, traditionally left to analog processes, by understanding and responding to the complexities of human intent,” commented Keith Hausmann, Chief Revenue Officer at Globality.

“Machine and deep learning are making it possible for users of complex B2B services to define and match complex requirements to ideal trading partners (suppliers) through an intuitive, needs-identification process and a vast understanding of potential trading partner strengths and capabilities.

User experience continues to improve as AI becomes better informed about individual preferences and company requirements with every interaction, especially in intangible areas like organizational culture and values.”

“The data warehouse becomes less important as a single source of truth,” commented Hyoun Park, CEO and Principal Analyst at Amalgam.

“Today’s single source replaces data aggregation and duplication with data linkages and late-binding of data sources to bring together the single source of truth on a real-time basis.
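Park’s late-binding point can be pictured with a minimal sketch (the table names and data are invented; this is not any vendor’s implementation): the sources stay where they live, and the “single source of truth” is assembled at query time rather than pre-copied into one store.

```python
import sqlite3

# Two independent "live" sources, queried in place rather than replicated.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Bolt")])

billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.executemany("INSERT INTO invoices VALUES (?, ?)",
                    [(1, 100.0), (1, 50.0), (2, 75.0)])

# Late binding: link the sources at read time to form the current truth.
totals = {cid: amt for cid, amt in billing.execute(
    "SELECT customer_id, SUM(amount) FROM invoices GROUP BY customer_id")}
report = [(name, totals.get(cid, 0.0))
          for cid, name in crm.execute("SELECT id, name FROM customers")]
print(report)  # [('Acme', 150.0), ('Bolt', 75.0)]
```

If either source changes, the next query reflects it immediately; nothing has to be re-replicated into a central warehouse first.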

This doesn’t mean that data warehouses aren’t still useful; it just means that the single source of truth can change on a real-time basis and corporate data structures need to support that reality.

And it becomes increasingly important to conduct analytics on data wherever it may be, rather than depending on replicating and transferring it back to a single warehouse.”

“Starting with an investment in cloud data warehouses like Redshift, Snowflake, and BigQuery, companies are also adopting modern data pipeline and ETL tools like Fivetran and Stitch to funnel more data into these structured storage solutions,” commented Peter Bailis, CEO at Sisu.

“What’s next? Companies will rebuild their diagnostic tools to cope with the influx of richer data.

To handle the dozens of data sources and near-real time data volumes in a typical organization, IT and data teams will rebuild their analytics infrastructure around four key layers: (i) a cloud data warehouse, like Snowflake, BigQuery, Redshift, or Azure; (ii) data pipeline tools like Fivetran and Stitch; (iii) flexible dashboarding and reporting tools like Looker; and (iv) diagnostic analytics tools to augment the abilities of analysts and BI teams.

Beyond 2020, governance comes back to the forefront.

As platforms for analysis and diagnosis expand, derived facts from data will be shared more seamlessly within a business, and data governance tools will help ensure the confidentiality, proper use, and integrity of that data, improving to the point that they fade into the background again.

In 2020, we’ll see a shift in how companies use and perceive analytics.”

“Machine learning has evolved to the point where it can improve call quality, for example,” commented Vonage‘s CMO Rishi Dave.

“When call data is lost or scrambled between call participants, it results in voice transfers that don’t arrive as they should.

AI can help predict those patterns of data change and autocorrect them.
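The prediction-and-autocorrect idea can be sketched in miniature (a hypothetical illustration, not Vonage’s method; real systems use far more sophisticated models): when samples go missing, continue the local trend of the signal rather than leaving a gap.

```python
def conceal_gaps(samples):
    """Fill None gaps by linear extrapolation from the last two good samples.

    A toy stand-in for predictive loss concealment in a media stream.
    """
    out = []
    for s in samples:
        if s is not None:
            out.append(float(s))
        elif len(out) >= 2:
            out.append(out[-1] + (out[-1] - out[-2]))  # continue the local trend
        elif out:
            out.append(out[-1])                        # repeat last known sample
        else:
            out.append(0.0)                            # silence if no history
    return out

stream = [1.0, 2.0, 3.0, None, None, 6.0]
print(conceal_gaps(stream))  # [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```

The listener hears a plausible continuation instead of a dropout; a learned model would replace the linear rule with a prediction trained on real voice data.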

This kind of AI-enabled advancement will lead to higher-quality calls in places, on an airplane or in rural areas, where we didn’t expect they were possible.”

“As AI and ML continue to be valuable and relevant to the enterprise, there will be an increased need for refined algorithms that can be achieved through graph,” commented Amit Chaudhry, Vice President, Product Marketing at Neo4j.

“We’ve seen that data without context can be damaging for algorithms and that we must scrutinize the data supply chain (i.e., the learning set) that factors into algorithms.

As we begin to incorporate more context into AI, the algorithms we produce will be able to suss out solutions from more abstract scenarios and effortlessly come up with the answers we need.”

“In the race for the cloud market, the major providers (Amazon AWS, Microsoft Azure, Google Cloud) are doubling down on their AI/ML offerings,” commented Pluralsight instructor Jerry Kurata.

“Prices are decreasing, and the number and power of services available in the cloud are ever increasing,” said Kurata.

“In addition, the number of low-cost or free cloud-based facilities and compute engines for AI/ML developers and researchers is increasing.

This removes many of the hardware barriers that prevented developers in smaller companies, or in locales with limited infrastructure, from building advanced ML models and AI applications.”

“The importance of efficiently processing neural networks will continue to grow. Currently, people run just a single network to process one thing,” commented Ashwini Choudhary, co-founder of Recogni.

“For example, they run a version of ResNet to complete a particular subtask of perception processing.

There is no redundancy in these networks, so you get what you get.

If that network fails to detect an object, there is no way to know something was missed unless there is redundancy in the system.

Designing redundancy is not free.

Once again, system designers will need either to allocate more processing computation or to improve its efficiency.

The way to think about this is similar to using Radar & LiDAR in the system as depth sensors.

While both provide depth information, they are redundant to each other.

There is no such mechanism for processing neural networks since the computation demand is so high and the processing system is way too expensive.

For redundancy, if you have to have two systems, your costs go up 2x.

Redundancy has to be brought in for things to go into production.
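The cross-checking role Choudhary describes for redundant networks can be sketched abstractly (the function and labels below are hypothetical, purely to illustrate the idea): run two independent detectors and flag anything only one of them reports as a possible miss by the other.

```python
def cross_check(detections_a, detections_b):
    """Compare two independent detectors' outputs.

    Returns (agreed, flagged): objects both detectors saw, and objects
    only one saw. Items in `flagged` signal a possible miss by the
    other detector. A simplified sketch, not a production system.
    """
    a, b = set(detections_a), set(detections_b)
    return a & b, a ^ b

agreed, flagged = cross_check({"car", "pedestrian"}, {"car", "cyclist"})
print(sorted(agreed))   # ['car']
print(sorted(flagged))  # ['cyclist', 'pedestrian']
```

This mirrors the Radar-plus-LiDAR analogy above: either sensor alone gives an answer, but only the pair can reveal that one of them missed something, at the cost of roughly doubling the compute.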

As these systems start to make progress, resolution increases, which raises the demand on computation.

A network with redundancy increases the demand on computation even further.”

Sign up for the free insideBIGDATA newsletter.

