Data Scientists Have Developed a Faster Way to Reduce Pollution, Cut Greenhouse Gas Emissions

By Devin Partida, Editor-in-Chief of ReHack.com.

Polymeric membranes assist with a wide variety of tasks, including water filtration and gas-vapor separation.

Designing a membrane for the desired function is more time-consuming than people may expect.

However, researchers at Columbia Engineering, Germany's Max Planck Society and the University of South Carolina applied data science to streamline the task.

 More specifically, they combined big data with machine learning to strategically design polymer membranes to act as gas filters.

People frequently depend on plastic films and membranes to separate mixtures of simple gases, such as carbon dioxide and methane.

Scientists have also suggested making membranes that segregate carbon dioxide from other gases.

Doing so facilitates tasks like carbon capture and natural gas purification.

The problem is that hundreds of thousands of plastics exist as possible candidates for these membranes.

That sounds like a good thing since it offers an abundance of choices.

However, since each one varies in its chemical makeup, testing and manufacturing a particular material is an exceptionally time-consuming and expensive endeavor.

These realities mean researchers have only examined about 1,000 options as possible membranes for gas separation.

Scientists concluded that a method of efficiently selecting the best polymers for gas filtration could increase the likelihood of people developing solutions faster and at lower costs.

They made a machine learning algorithm that linked the chemical structures of the polymers tested for this purpose so far with their transport properties.

The key transport properties for a gas-separation membrane are its permeability and its selectivity. The researchers wanted to use data science to predict the best materials for gas-separation membranes without the usual expensive and lengthy approach.
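The article does not describe the team's actual model, but the idea of linking chemical structure to transport properties can be sketched with a toy example. Everything below is illustrative: the "fingerprints" (bit vectors marking hypothetical substructures), the permeability values, and the simple linear fit are all stand-ins for the far richer features and models the researchers would have used.

```python
# Hypothetical sketch of a structure-to-property model: map a polymer
# "fingerprint" (bits marking chemical substructures) to a measured
# transport property such as permeability. Toy data, not real chemistry.

def predict(fingerprint, weights):
    """Linear structure-property model: weighted sum of substructure bits."""
    return sum(w * x for w, x in zip(weights, fingerprint))

def fit(training_set, n_features, lr=0.05, epochs=1000):
    """Least-squares fit by cyclic gradient descent over
    (fingerprint, permeability) pairs."""
    weights = [0.0] * n_features
    for _ in range(epochs):
        for fp, target in training_set:
            error = predict(fp, weights) - target
            weights = [w - lr * error * x for w, x in zip(weights, fp)]
    return weights

# Toy training data: 3-bit fingerprints with made-up permeabilities.
data = [([1, 0, 1], 12.0), ([0, 1, 1], 7.0), ([1, 1, 0], 9.0)]
weights = fit(data, n_features=3)
```

Once trained on polymers whose transport properties are already known, a model like this can score a new polymer from its structure alone, which is the step that replaces slow, expensive lab synthesis in the screening stage.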

After building an algorithm, they used it to assess more than 11,000 previously identified polymers.

They ended up with approximately 100 polymer options not already examined for gas transport capabilities.
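That screening step — score a large library of known polymers, then shortlist the high scorers that have never been tested in the lab — can be sketched in a few lines. The polymer names, scores, and `screen` function here are invented placeholders, not anything from the actual study.

```python
# Hypothetical screening step: rank a library of candidate polymers by a
# predicted separation score and keep the top untested ones.

def screen(candidates, already_tested, score, top_n=100):
    """Return the top_n highest-scoring candidates not yet tested in the lab."""
    untested = [c for c in candidates if c not in already_tested]
    return sorted(untested, key=score, reverse=True)[:top_n]

# Made-up library: polymer name -> model-predicted separation score.
library = {"polymer_A": 0.91, "polymer_B": 0.42,
           "polymer_C": 0.77, "polymer_D": 0.88}
tested = {"polymer_A"}  # already characterized experimentally

shortlist = screen(library, tested, score=library.get, top_n=2)
```

In the study, this kind of filter reduced more than 11,000 candidates to roughly 100 worth synthesizing — the payoff being that expensive lab work is spent only on the most promising materials.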

 The researchers then made the suggested polymers and turned them into thin films.

The algorithm predicted that, even without lab trials, those materials would exceed the performance of membranes currently used to separate carbon dioxide from methane.

Real-life tests showed that the polymers separated gases with a capability close to what the algorithm predicted.

After the team created their algorithm to gauge a particular membrane's likely ability to perform, they turned their attention to figuring out which chemical structures are the best to consider when creating gas-separation membranes.

Getting those insights should spur progress as scientists determine the most appropriate uses for these membranes to help the environment or for other reasons.

Brian Benicewicz is a University of South Carolina chemistry professor who worked on the project.

While outlining the benefits of the new process, he said, “It removes the guesswork and the old trial-and-error work, which is very ineffective.

You don't have to make hundreds of different materials and test them.

Now you're letting the machine learn.

It can narrow your search.”

Benicewicz also explained that membrane design often becomes more challenging as developers try to strike a balance between permeability and selectivity.

The molecules in question are so tiny that if a membrane lets one gas through, it may not effectively filter out another.

 Another member of the research team compared this machine learning-driven method to Netflix helping people find the movies they like most.

Algorithms assess what a person watched before, then assign percentage scores to available titles to tell a viewer how good a match each one is.

Here, machine learning examined characteristics of polymers proven effective for gas separation to increase the chances of good results when using a new polymer.

Sanat K. Kumar, a chemical engineering professor from Columbia University who worked on this project, believes this new application of big data could lead to better polymer designs.

Then, researchers may feel more compelled to look at materials they never considered before, possibly ending up with superior results.

 Kumar weighed in by saying, “This work thus points to a new way of materials design.

Rather than test all the materials that exist for a particular application, you look for the part of a material that best serves the need that you have.

When you combine the very best materials, then you have a shot at designing a better material.”

People who participated in this work said that this process is commercially viable.

One of the near-term applications for separating carbon dioxide from methane with a membrane material discovered through this process is in the natural gas industry.

Carbon dioxide causes pipeline corrosion, providing a reason to efficiently and thoroughly remove it.

 Another possibility is using membranes to separate greenhouse gases from coal.

Doing that would help reduce the greenhouse gases released into the environment.

It's still too early to say whether this high-tech method of choosing membrane materials will become a widely used option.

No matter what, it's an inspiring example of using machine learning to overcome obstacles and get impressive results.

Bio: Devin Partida is a big data and technology writer, as well as the Editor-in-Chief of ReHack.com.

