Artificial intelligence (AI) is rapidly gaining ground as a core business competency. Today’s machine learning (ML) and deep learning (DL) algorithms promise to revolutionize business models and processes, restructure workforces, and transform data infrastructures to enhance process efficiency and improve decision-making throughout the enterprise. Gone are the days of data silos and manually tuned algorithms.
It is widely believed that AI’s growth was stunted in the past mainly by the unavailability of large data sets. Big data changed all that, enabling businesses to take advantage of high-volume, high-velocity data to train AI algorithms for business-process improvements and enhanced decision-making.
The Road to AI Leads through Information Architecture describes how hybrid Data Management, Data Governance, and business analytics can together transform enterprise-wide decision-making. According to the author, these three core business practices can enable organizations of all sizes “to unleash the power of AI in the enterprise.”
The Role of Data Architecture in Unleashing the Power of Artificial Intelligence
William McKnight, the president of McKnight Consulting Group, has said that “Information Architecture” plays a key role in establishing order in the continuous evolution of emerging data technologies. McKnight discusses specific measures that organizations should take to embrace AI and streaming data technologies, and the long-range impact of the General Data Protection Regulation (GDPR) on enterprise Data Management practices. He recognizes that while streaming data is the only way to deal with the high velocity of big data, strong Data Governance measures will ensure GDPR compliance.
Recently, the umbrella field of AI has gained traction because of the innovative IT solutions enabled by machine learning and deep learning technologies. The terms “intelligent” or “smart” associated with any IT system specifically point toward the ML or DL capabilities of such systems.
Well-managed Data Architecture and AI technologies are poised to drive future innovations in IT, which will bring better opportunities for businesses through technological disruptions. However, these trends also indicate that businesses will need highly capable Data Science experts, groomed in AI, predictive modeling, ML, and DL, among other skills, to drive this transformative tech leadership.
A DATAVERSITY® webinar points out that core Data Management technologies like artificial intelligence, machine learning, and big data require a sound Data Architecture, with data storage and Data Governance best practices in place. The webinar discusses how the latest Data Architecture trends support organizational goals. Tomorrow’s data technology expert will be responsible for implementing and sustaining a Data Strategy and will be expected to handle both the risks and the newer profit opportunities with equal finesse.
But what kind of data infrastructure will allow that to happen? A well-defined and structured Data Architecture that accommodates big data, IoT, and AI while complying with all applicable GDPR requirements.
Cloud: The Present and Future Savior of Enterprise Analytics
As businesses increasingly rely on data and analytics to compete, Data Architecture is assuming a larger role in the enterprise. In the era of digital business, the new norm for Data Architecture is a dynamic and scalable model that is, to some extent, met by the public cloud. The latest analytics requirement is to process data at the source, allowing AI-based analytics to run across the data center network to the edge of the enterprise, as discussed in How to Create Cloud-Based Data Architectures.
Cloud infrastructure offers direct benefits in the management and delivery of data-driven, actionable intelligence. The “analytics everywhere” trend, which is gaining momentum, will drive the change from on-premises or hosted analytics to the edge computing era, where business analytics will happen in real time, much closer to the source of the data.
In the IoT age, businesses cannot afford to lose valuable time and money collecting and depositing incoming data in a far-away location. Analytics will happen at the edge of the business, which signals the next phase of cloud computing. The cloud-first strategy is already here, with more and more organizations adopting the cloud. So, what’s next for analytics? Edge computing? Serverless computing?
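The idea of analyzing data at the source, rather than shipping every reading to a distant data center, can be illustrated with a minimal sketch. The function and window size below are hypothetical examples, not from the article: an edge device summarizes batches of raw sensor readings locally and forwards only compact aggregates upstream.

```python
from statistics import mean

def summarize_readings(readings, window=10):
    """Aggregate raw sensor readings at the edge so only compact
    summaries, not every data point, leave the device."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "count": len(chunk),
            "mean": mean(chunk),
            "min": min(chunk),
            "max": max(chunk),
        })
    return summaries

# 100 raw readings collapse into 10 summary records to transmit.
raw = [20.0 + (i % 7) * 0.5 for i in range(100)]
compact = summarize_readings(raw)
print(len(raw), "->", len(compact))  # 100 -> 10
```

The design choice here is the essence of edge analytics: bandwidth and latency are traded away in favor of local compute, and only decision-ready summaries travel over the network.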
If Data Architectures are robust enough, analytics will have the potential to go “viral,” both within and outside the organization. In that scenario, even citizen data scientists will be able to conduct self-service analytics at the point of data ingestion.
Human-Centric AI System Designs: A Panacea?
Andrew Ng recommends that AI be adopted as an enterprise-wide decision-making strategy. As artificial intelligence technologies enable accurate forecasting, enhanced process management through automation, and higher performance metrics for the whole organization, businesses that choose to ignore AI will be left behind. Machine learning, deep learning, human-machine interactions, and autonomous systems can jointly deliver results unmatched by any other business system.
Artificial Intelligence for Data-Driven Disruption discusses the power of an “AI-powered engine” to deliver real-time insights for managerial decision-making. With the ever-rising volume, variety, and velocity of business data, every business user, from the citizen data scientist to the seasoned data steward, will need quick and timely access to data.
In the smart-systems era, we cannot overlook the fact that even AI algorithms can fail to deliver results if they are not properly implemented or adapted to human work environments. Many of the AI algorithms used today are similar to those used many years ago; it is the computers and processors that have become faster and more powerful.
While it is widely acknowledged that advanced artificial intelligence can automate many rote human tasks and can even “think” in limited cases, AI systems have not yet proven themselves in “disaster situations,” as in the case of self-driving cars or natural-calamity predictions. Thus, while AI algorithms can be extensively trained on data to emulate human thinking to an extent, AI researchers have still not been able to reproduce human cognitive abilities in a robot or a smart machine.
The most fundamental difference is that the human brain can respond to original situations, while the machine brain can only respond to second-hand situations transmitted through human-experience data, as explained in Smarter Together: Why Artificial Intelligence Needs Human-Centered Design.
Future algorithms may be trained to emulate human cognitive capabilities. But while humans can err through overconfidence, machine intelligence relies strictly on the study and application of data-driven facts. According to “Kasparov’s law,” a weak human with a machine and a better process is superior to a strong human with a machine and an inferior process; even a modest algorithm can improve human thinking, so it is the collaboration process itself that must be improved to enable the best possible human-machine partnership.
The artificial intelligence algorithms of the future should be designed from a human point of view, to reflect the actual business environment and information goals of the decision-maker. The AI software engineer is the person in a Data Science team who plays the critical role of bridging the gap between data scientists and data architects.
Architectural Requirements of Machine Learning-Driven Artificial Intelligence Systems
In machine learning, data is both the teacher and the trainer, shaping the algorithm without explicit programming. Thus, data preparation for ML pipelines can be challenging if the Data Architecture has not been refined enough to interoperate with the underlying analytics platforms.
Machine learning is best suited to high-volume, high-velocity data. The Data Architecture layer in an end-to-end analytics subsystem must support the data preparation requirements for machine learning algorithms to work. A dedicated development life cycle supporting ML models has to be available, and the ML platform must support several ML frameworks so that custom solutions from commercial vendors can be accommodated. The public cloud is a great storage and compute environment for ML systems simply because of its architectural elasticity.
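The claim that “data shapes the algorithm” and that the architecture must feed the ML step properly prepared data can be made concrete with a toy sketch. Everything below is illustrative, not a specific vendor platform: a min-max scaling step stands in for data preparation, and a tiny nearest-centroid classifier stands in for a model whose behavior is determined entirely by the data it is given.

```python
def min_max_scale(rows):
    """Data-preparation step: scale each feature column to [0, 1]
    so no single column dominates distance-based learning."""
    cols = list(zip(*rows))
    lows = [min(c) for c in cols]
    spans = [(max(c) - min(c)) or 1.0 for c in cols]
    return [
        [(v - lo) / span for v, lo, span in zip(row, lows, spans)]
        for row in rows
    ]

def nearest_centroid_fit(rows, labels):
    """'Training' here is shaped purely by the data: compute the
    mean feature vector (centroid) for each class label."""
    groups = {}
    for row, lab in zip(rows, labels):
        groups.setdefault(lab, []).append(row)
    return {
        lab: [sum(c) / len(c) for c in zip(*members)]
        for lab, members in groups.items()
    }

def predict(centroids, row):
    """Assign the label of the closest centroid (squared distance)."""
    return min(
        centroids,
        key=lambda lab: sum((a - b) ** 2 for a, b in zip(centroids[lab], row)),
    )

# Toy pipeline: prepare the data, then let it shape the model.
raw = [[1.0, 200.0], [1.2, 220.0], [9.0, 900.0], [9.5, 950.0]]
labels = ["low", "low", "high", "high"]
scaled = min_max_scale(raw)
model = nearest_centroid_fit(scaled, labels)
print(predict(model, [0.1, 0.1]))  # "low"
```

Note what happens if the scaling step is skipped: the second feature column (values in the hundreds) would swamp the first in every distance calculation. That is precisely the kind of preparation burden the Data Architecture layer must carry before any ML framework sees the data.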
An organization can only take advantage of this huge mass of data from many different sources if a sound Data Architecture (data as an enterprise layer) is in place across the organization, and if end-to-end AI-powered analytics systems have been deployed to empower all types of business users to engage in just-in-time analytics and BI activities.
The Future of AI and Data Architecture
In the coming years, as information derived from data becomes a corporate asset with high revenue potential, organizations will become more disciplined about monetizing and measuring the impact of data, just as they do with other KPIs.
Gartner states that by 2021, data centers will have to integrate AI capabilities into their architectures. Make Room for AI Applications in the Data Center Architecture predicts that AI applications will penetrate every vertical in the near future, so it makes sense to adopt artificial intelligence, machine learning, and deep learning practices in the data center. As these technologies will challenge existing data storage technologies, newer and better platforms such as edge or serverless computing may be the answer.
This article has been published from a wire agency feed without modifications to the text. Only the headline has been changed.