What Is the Most Important Question for Data Science (and Digital Transformation)?
With so many buzzwords surrounding artificial intelligence and machine learning, it is difficult to understand which technologies can bring business value and which are best left to mature in the laboratory.
Top 10 Technology Trends of 2019
This article outlines 10 top trending technologies for 2019, a list that covers diverse topics such as security, the Internet of things, reinforcement learning, energy sustainability, and smart cities.
This AI Researcher Is Trying To Ward Off a Reproducibility Crisis
Joelle Pineau, a machine-learning scientist at McGill University, is leading an effort to encourage artificial-intelligence researchers to open up their code.
Explainability: Cracking Open the Black Box
What is explainability in artificial intelligence, and how can we leverage different techniques to open the black box of AI and peek inside? This practical guide offers a review and critique of the various techniques of interpretability.
AI Index 2019 Assesses Global AI Research, Investment, and Impact
Leaders in the AI community came together to release the 2019 AI Index report, an annual attempt to examine the biggest trends shaping the AI industry, breakthrough research, and AI’s impact on society.
Deep Learning Has a Size Problem
To make sure deep learning meets its promise, we need to reorient research away from state-of-the-art accuracy and toward state-of-the-art efficiency. We need to ask whether models enable the largest number of people to iterate as quickly as possible using the fewest resources on the most devices.
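One crude proxy for the cost side of that tradeoff is a model's raw parameter count. A minimal sketch, assuming PyTorch and torchvision are available (ResNet-50 is used here only as a familiar example):

    import torch
    import torchvision.models as models

    # Instantiate a standard vision model without downloading weights.
    model = models.resnet50(weights=None)

    # Total parameter count: a rough proxy for memory and compute cost.
    n_params = sum(p.numel() for p in model.parameters())
    print(f"ResNet-50 parameters: {n_params:,}")  # roughly 25.6 million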
An Epidemic of AI Misinformation
The media is often tempted to report each tiny new advance in a field, be it artificial intelligence or nanotechnology, as a great triumph that will soon fundamentally alter our world. Occasionally, of course, new discoveries are underreported.
From Blockbuster to Blockchain: How To Manage a Not-Quite-Settled Technology Platform
Rather than waiting for a dominant player to emerge, the oil and gas industry is at a point in its journey with blockchain where it makes sense to actively start working toward common architectures and standards in the blockchain domain.
Computers Evolve a New Path Toward Human Intelligence
Neural networks that borrow strategies from biology are making profound leaps in their abilities. Is ignoring a goal the best way to make truly intelligent machines?
Google’s Self-Proclaimed Quantum Supremacy and Its Effect on Artificial Intelligence
When Google claimed quantum supremacy, IBM challenged the claim. Nonetheless, the development carries significant implications for the future of artificial intelligence.
Quantum Supremacy Using a Programmable Superconducting Processor
Physicists have been talking about the power of quantum computing for more than 30 years, but the questions have always been: Will it ever do something useful, and is it worth investing in?
An AI Pioneer Wants His Algorithms To Understand the "Why"
Deep learning is good at finding patterns in reams of data but can't explain how they're connected. Turing Award winner Yoshua Bengio wants to change that.
Olis Robotics, iCsys Form Partnership for New Machine-Learning ROV Controller
The controller from Olis will be distributed and supported by iCsys and is expected to increase efficiency and decrease costs.
Why It’s Time To Start Talking About Blockchain Ethics
Blockchain technology is changing the nature of money and organizations. We should probably start pondering the potential consequences.
When Is a Neural Net Too Big for Production?
Here are some thoughts on recent discussions about natural-language-processing transformer models being too big to put into production, along with a dive into how such models have been shipped at Monzo using the HuggingFace library.
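The article covers Monzo's specific deployment; the snippet below is only a generic sketch of the common remedy it discusses: serving a distilled (smaller, faster) model through the HuggingFace transformers library instead of a full-size one. The model name and example input are illustrative, not Monzo's actual setup.

    from transformers import pipeline

    # DistilBERT is a distilled variant of BERT, a common swap when the
    # full-size model is too large or slow for production serving.
    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    print(classifier("The replacement card arrived quickly."))
    # e.g., [{'label': 'POSITIVE', 'score': 0.999...}]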
The Hidden Risk of AI and Big Data
With recent advances in AI being enabled through access to so much big data and cheap computing power, there is incredible momentum in the field. Can big data really deliver on all this hype, and what can go wrong?
Dremio Releases Data Lake Engines for AWS and Azure
The company has released new technology for cloud data lakes. Can the company take data in relatively slow cloud object storage and make queries against it faster?
Researchers Demonstrate All-Optical Neural Network for Deep Learning
Even the most powerful computers are still no match for the human brain when it comes to pattern recognition, risk management, and other similarly complex tasks. A new approach, however, could enable parallel computation with light, simulating the way neurons respond in the human brain.
How Is Industrial Augmented Reality Taking Form?
Industrial augmented reality (AR) takes several forms. The holy grail—and eventually prevalent format—is headworn AR, where visualization software is installed on AR glasses such as Microsoft’s HoloLens. Nearer-term deployments also include smartphone- or tablet-based AR.
Upstream Sector Emerges as the Epicenter for Industrial Internet Adoption
The upstream sector is witnessing more implementation of the industrial Internet of things than other sectors of the oil and gas industry. This is driven by the need to reduce risk and maximize returns through digitalization, according to data and analytics company GlobalData.
Decentralized and Collaborative AI: How Microsoft Research Is Using Blockchains To Build More-Transparent Machine-Learning Models
Recently, AI researchers from Microsoft open-sourced the Decentralized & Collaborative AI on Blockchain project, which enables the implementation of decentralized machine-learning models based on blockchain technologies.
Column: The Death of Big Data and the Emergence of the Multicloud Era
The era of Big Data is coming to an end as the focus shifts from how we collect data to processing that data in real time. Big Data is now a business asset supporting the next era of multicloud, machine learning, and real-time analytics.
Open-Source Tool Kit Uses 3D Printing for Micromodel Generation
In this paper, the authors present an open-source tool kit for the generation of microfabricated transparent models of porous media (micromodels) from image data sets using optically transparent 3D polymer additive manufacturing (3D printing or sintering).
Quantum Computing: The Next Big Thing for Oil Exploration?
Quantum computers exploit the peculiar behavior of objects at the atomic scale and use the qubit as the basic unit of quantum computing. A quantum computer with only 100 qubits would, theoretically, be more powerful than all the supercomputers on the planet combined.
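The "more powerful than all supercomputers" framing rests on the exponential size of the quantum state space. An n-qubit register is described by a superposition over all 2^n basis states,

    |\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x |x\rangle, \qquad \sum_x |\alpha_x|^2 = 1,

so simulating just 100 qubits classically means tracking 2^100 ≈ 1.27 × 10^30 complex amplitudes, far more numbers than any existing supercomputer can store.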
Training a Single Artificial-Intelligence Model Can Emit as Much Carbon as Five Cars in Their Lifetimes
Researchers at the University of Massachusetts, Amherst, performed a life-cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 lbm of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car.
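The headline comparison is straightforward division against the study's own baseline: the worst case examined (a large transformer trained with neural-architecture search) emitted roughly 626,000 lbm of CO2 equivalent, while the study's figure for an average American car over its full lifetime, fuel included, is approximately 126,000 lbm, giving

    \frac{626{,}000\ \text{lbm CO}_2\text{e}}{126{,}000\ \text{lbm per car lifetime}} \approx 5 \text{ cars.}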
Strengthening the Energy Sector’s Cyber Preparedness
The reality is that threats continue to outrun the sector’s security evolution, primarily because organizations are increasingly connecting operational technology, such as supervisory control and data acquisition systems and industrial control systems, to their information technology networks.
AI-Generating Algorithms: An Alternate Paradigm for Producing General Artificial Intelligence
This paper describes a path to general artificial intelligence (AI) (i.e., AI that is as smart as, or smarter than, humans) based on the trend in machine learning that hand-designed solutions eventually are replaced by more-effective learned solutions.
No Cloud Required: Why AI’s Future Is at the Edge
The algorithms for running AI applications have been so big that they’ve required powerful machines in the cloud and data centers, making many applications less useful on smartphones and other edge devices. Now, that concern is quickly melting away, thanks to a series of recent breakthroughs.
Integrated Internet of Things Platform Helps Close the Gap Between Data Science and Operations
Arundo Analytics has built an integrated industrial Internet of things platform that allows data scientists to productize data-science solutions and accelerate feedback and improvement iterations between end users and data scientists.
Microsoft Launches Drag-and-Drop Machine-Learning Tool
Microsoft announced three new services that aim to simplify the process of machine learning—an interface for a tool that automates the process of creating models; a new no-code visual interface for building, training, and deploying models; and hosted Jupyter-style notebooks for advanced users.
Researchers Want To Study AI the Same Way Social Scientists Study Humans
Maybe we don’t need to look inside the black box after all. Maybe we just need to watch how machines behave, instead.
Neural Networks Plus CFD Speed Up Simulation of Fluid Flow
High-fidelity 3D engineering simulations are valuable in making decisions, but they can be cost-prohibitive and require significant amounts of time to execute. The integration of deep-learning neural networks with computational fluid dynamics may help accelerate the simulation process.
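As an illustration of the surrogate-model idea (not the specific method in the work above), one can train a small network on input/output pairs from completed CFD runs and then query it in place of the solver. The sketch below uses synthetic stand-in data and assumes PyTorch:

    import torch
    import torch.nn as nn

    # Hypothetical stand-in data: each row of X holds simulation inputs
    # (e.g., inlet velocity, viscosity, geometry parameters); y stands in
    # for a quantity of interest computed by a full CFD run.
    X = torch.rand(1000, 4)
    y = X[:, :1] ** 2 + 0.5 * X[:, 1:2]

    surrogate = nn.Sequential(
        nn.Linear(4, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, 1),
    )
    opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for step in range(2000):
        opt.zero_grad()
        loss_fn(surrogate(X), y).backward()
        opt.step()

    # Once trained, the surrogate answers in microseconds what the full
    # simulation answers in hours, at some cost in fidelity.
    print(surrogate(torch.rand(1, 4)))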
DeepMind and Google: The Battle To Control Artificial Intelligence
AGI stands for artificial general intelligence, a hypothetical computer program that can perform intellectual tasks as well as, or better than, a human. AGI will make today’s most advanced AIs look like pocket calculators.
Harnessing Organizational Knowledge for Machine Learning
In collaboration with Stanford University and Brown University, Google explores how existing knowledge in an organization can be used as noisier, higher-level supervision—or, as it is often termed, weak supervision—to quickly label large training data sets.
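The core mechanism in this line of work is the labeling function: a heuristic that encodes one piece of domain knowledge and either votes on a label or abstains. The toy sketch below uses hypothetical rules and a simple majority vote; systems such as Snorkel combine the votes with a learned statistical model instead.

    # Each labeling function encodes one piece of organizational knowledge
    # as a noisy heuristic: vote POSITIVE, NEGATIVE, or ABSTAIN.
    ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

    def lf_keyword(text):
        return POSITIVE if "urgent" in text.lower() else ABSTAIN

    def lf_too_short(text):
        return NEGATIVE if len(text.split()) < 3 else ABSTAIN

    def lf_exclamation(text):
        return POSITIVE if "!" in text else ABSTAIN

    LABELING_FUNCTIONS = [lf_keyword, lf_too_short, lf_exclamation]

    def weak_label(text):
        votes = [lf(text) for lf in LABELING_FUNCTIONS]
        votes = [v for v in votes if v != ABSTAIN]
        if not votes:
            return ABSTAIN
        # Simple majority vote over the non-abstaining functions.
        return POSITIVE if sum(votes) > len(votes) / 2 else NEGATIVE

    print(weak_label("Urgent: the server is down!"))  # POSITIVE (1)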
Collaboration Redefines the Human/Robot Relationship
Robots have been a part of industrial production for decades, but the interface between humans and robots has changed as automation technologies increased in complexity, scope, and scale. Once a novelty, collaborative robots are projected to become a significant element of the automation landscape.
Foundations Built for a General Theory of Neural Networks
Neural networks can be as unpredictable as they are powerful. Now mathematicians are beginning to reveal how a neural network’s form will influence its function.
How AI Can Help Solve Some of Humanity’s Greatest Challenges—and Why We Might Fail
In 2015, the United Nations ratified the 2030 Sustainable Development Goals. Technology will be critical in the pursuit of these ambitious targets, but the pace and scale of change create risks that humanity must take very seriously.
Analysis of 16,625 Papers Points to the Future of AI
A study of 25 years of artificial-intelligence research suggests the era of deep learning may come to an end.
Industrial Internet of Things Can Improve Profitability and Integrity Management
For too long, owner/operators have managed operational profitability using paper-based processes or monthly reporting cycles. In the new technological climate, this approach has proved less effective. Enter the industrial Internet of things.
Houston Startup Creates the Alexa or Siri for Oil and Gas Companies
Nesh’s digital-assistant technology aims to make industry information more easily accessible to energy professionals.
Greedy Pursuit: Algorithms Show Promise in Measuring Multiphase Flow
“Greedy pursuit” in the realm of algorithms is a good thing. Saudi Aramco studied such algorithms to produce images simulating the flow inside a pipe’s cross section, possibly reducing the need for separator-based multiphase flowmeters.
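"Greedy pursuit" here refers to sparse-approximation algorithms such as orthogonal matching pursuit, which build up a solution by repeatedly selecting the dictionary element most correlated with the remaining residual. The sketch below is a generic demonstration on synthetic data using scikit-learn's implementation, not Saudi Aramco's method:

    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for the imaging problem: recover a sparse
    # signal x (e.g., a pixelated cross section) from measurements y = Ax.
    n_features, n_measurements, sparsity = 256, 64, 8
    A = rng.standard_normal((n_measurements, n_features))
    x_true = np.zeros(n_features)
    support = rng.choice(n_features, sparsity, replace=False)
    x_true[support] = rng.standard_normal(sparsity)
    y = A @ x_true

    # Greedily pick the column most correlated with the residual, refit
    # on the selected support, and repeat until the budget is spent.
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity)
    omp.fit(A, y)

    print(np.linalg.norm(omp.coef_ - x_true))  # near 0 on easy instances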
How Algorithms Are Taking Over Big Oil
Artificial intelligence has come to the oil patch, accelerating a technical change that is transforming the conditions for the oil and gas industry’s 150,000 US workers.
Predictive Analytics Will Help Oil Companies Forecast the Future
The ability to predict the future to optimize operations has been the aim of oil and gas companies for some time. Could that time finally be here?
Radical New Neural Network Design Could Overcome Big Challenges in AI
Researchers borrowed equations from calculus to redesign the core machinery of deep learning so it can model continuous processes like changes in health.
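The redesign in question is the neural ordinary-differential-equation (ODE) idea: instead of a stack of discrete layers, a small network defines continuous dynamics dh/dt = f(h, t) for the hidden state, and a solver integrates them. Below is a minimal sketch with a fixed-step Euler solver, assuming PyTorch; the actual paper uses adaptive solvers and the adjoint method for training.

    import torch
    import torch.nn as nn

    class ODEFunc(nn.Module):
        """Small network defining the dynamics dh/dt = f(h, t)."""
        def __init__(self, dim):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim + 1, 32), nn.Tanh(), nn.Linear(32, dim)
            )

        def forward(self, h, t):
            t_col = torch.full((h.shape[0], 1), float(t))
            return self.net(torch.cat([h, t_col], dim=1))

    def odeint_euler(func, h0, t0=0.0, t1=1.0, steps=20):
        # Fixed-step Euler integration: h <- h + dt * f(h, t).
        h, dt = h0, (t1 - t0) / steps
        for i in range(steps):
            h = h + dt * func(h, t0 + i * dt)
        return h

    func = ODEFunc(dim=2)
    h0 = torch.randn(5, 2)        # batch of 5 initial hidden states
    h1 = odeint_euler(func, h0)   # hidden state evolved in continuous depth
    print(h1.shape)               # torch.Size([5, 2])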
Don't miss out on the latest technology delivered to your email monthly. Sign up for the Data Science and Digital Engineering newsletter.