Industrial Internet Consortium Creates Vision on AI for Industrial IoT

The Industrial Internet Consortium® (IIC™), which works to accelerate the adoption of the Industrial Internet of Things (IIoT), has announced it is leading the vision for industrial artificial intelligence (AI) with new bodies of work and will be sharing insights at the IoT Solutions World Congress (IOTSWC).

Building on the successful Industrial IoT Analytics Framework Technical Report (IIAF), which offered industrial analytics guidance in evolving fields such as big data, AI and machine learning, IIC is expanding its program of work to focus on AI in industrial IoT applications. The Industrial Analytics Task Group has been expanded and renamed the Industrial AI Task Group.

“No single technology has the potential to change how we do business in virtually every sector of the economy more than artificial intelligence,” said Wael William Diab, Industrial Artificial Intelligence Task Group Chair, Secretary of the IIC Steering Committee, IOTSWC 2018 AI Forum Chair and Senior Director for Huawei. “To achieve transformative business outcomes, IT and OT stakeholders will have to work closely together on the integration of AI into industrial applications.”

The Industrial AI Task Group hosted a workshop this past May at the IIC quarterly meeting in Helsinki to promote IIC’s vision of AI. In a panel session moderated by Diab, four IIC experts discussed the importance of AI in industrial applications. The panel was well received and attended by over 100 IIC members. Here are some of the panelists’ remarks:

“AI is still in its infancy but with great potential in industrial applications,” said Shi-Wan Lin, Co-Chair of the IIC Technology Working Group and CEO & Co-Founder of Thingswise. “There’s still lots of work to be done, but applying AI to the vast amounts of industrial data will enable IIoT systems to create higher value in the years to come.”

“While AI toolsets will help solve customers’ problems, you also need good, clean, consistent data and analytics to put it into action,” said Christopher Ganz, IIC Steering Committee Member and Group Vice President for Service Research & Development at ABB.

“Custom silicon is very important for AI,” said Liang Guang, IIC Member and Senior Standards Manager at Huawei Technologies. “While some of the big vendors have used custom hardware designs, heterogeneous accelerators can offer the performance needed for AI on the cloud.”

“There are many industries advancing AI,” said Christoph Fritsch, IIC Member and Senior Director of Industrial, Scientific and Medical Markets at Xilinx. “We’ll continue to see the number of applications that use AI grow in a wide variety of industries in the next few years.”

This October, IOTSWC will launch its inaugural AI & Cognitive Systems Forum. The forum will explore the various aspects of AI from a foundational technology perspective to transformational use cases that open up endless business possibilities in IIoT. In addition, it will showcase how AI-enabled IIoT solutions can provide enhanced insights, complex decision making, self-learning and self-healing in industrial environments.

A workshop called Deciphering AI will precede the forum, serving as an “AI 101” that gives attendees background on AI as well as the key concerns and challenges of deploying it. The IOTSWC AI & Cognitive Systems Forum and the Deciphering AI Workshop will be chaired by Diab and IIC Member Edy Liongosari, Chief Research Scientist at Accenture Labs, with contributions from and sessions led by several IIC members. Diab and Liongosari are also members of the IOTSWC Program Committee.

For more information, visit the Industrial Internet Consortium.

How New Business Models Are Combining the IoT and Services

By Timothy Chou, IoT Lecturer, Stanford University

If you work for a company that produces such items as heating, ventilation and air-conditioning systems, 3D cardiac imaging machines, seed drills or submersible pumps, chances are that you’re already aware of the rise of the Internet of Things (IoT). All of us in the tech community are constantly excited about any opportunity to tell the rest of you about our cool technology that can run on your machine, vehicle or product. We see the IoT’s potential not only to improve services but also to connect each product to the internet, collect data from it and, using advanced machine learning technology, derive predictions from the ongoing analysis of that data.

But if you’re the CEO of an already successful manufacturing company, maybe the questions you’re asking right now are “Why should I care? Isn’t the IoT just the stuff my geeky R&D staff care about? Don’t my products sell just fine without IoT connectivity? How can it possibly be meaningful to my business?”

In this article, I’ll be making the case that with IoT, you can not only double the size of your business but also create a barrier that your competition will find difficult to cross. I’ll cover three basic business models and discuss the organisational implications of IoT, as well as some of the objections you’ll encounter. Now that you’re ready to lead the digital transformation of your company, here’s a roadmap to point you in the right direction.

Software-defined machines

Out in the physical world, next-gen products are increasingly being powered by software. The new £68,000 Porsche Panamera is controlled by 100 million lines of code, up from only two million lines in the previous generation. Tesla owners have come to expect new features delivered to their vehicles through software updates, not by a mechanic.

Healthcare machines are also becoming more software-defined. A drug infusion pump may run on more than 200,000 lines of code, an MRI scanner seven million. On construction sites, a modern boom lift has 40 sensors and 3 million lines of code, while on the farm, a combine harvester has over 5 million. We can debate whether size is a worthwhile measure of the value of software, but you get the point. Software is starting to define physical machines.

Since machines are now becoming more software-defined, maybe the business models that once applied only to software will start to apply more and more to the world of physical products. Let’s see how well this theory stands up to scrutiny.

Business model 1: Product & disconnected services

In the early days of the software industry, we sold software on CDs, so if you wanted to use the newest version, you’d have to go out and buy a copy. As software products became more complex, companies like Oracle moved to a business model where you bought each product (ERP or database) together with a service contract. Over time, this service contract became the largest and most profitable component of many enterprise software product companies. In the year before Oracle bought Sun (when Oracle was still a pure software business), it had revenues of approximately $15bn, but only $3bn was product revenue. The remaining $12bn – over 80% of revenue – was high-margin, recurring service contracts.

But what is service? Is it answering the phone nicely from Bangalore? Is it flipping burgers at McDonald’s? No. Service is the delivery of information that is personal and relevant to you. That could be the hotel concierge giving you directions to the best Szechuan Chinese restaurant in town, or your doctor telling you that based on your genome and lifestyle, you should be on a specific medication. Service is personal and relevant information.

I’ve heard many executives of product manufacturing companies say, “Our customers won’t pay for service”. Well of course, if you think that service is just fixing broken things, then your customers will think you should be building a more reliable product. Remember that Oracle service revenue. In 2004, the Oracle Support organisation studied 100m support requests and found that over 99.9% of them had been answered with already known information. Aggregating information from thousands of different uses of the software, even in a disconnected state, represented huge value over the knowledge of a single person in a single location. Real service is not break-fix support but rather information about how to maintain or optimise the availability, performance or security of the product.

You might wonder why, in America, General Electric bothers to run ads about the industrial internet during Saturday Night Live commercial breaks. All you need to do is download their 2016 10-K and look on page 36. Out of $113bn in revenue, GE recognised that $52bn, or nearly half of it, came from service revenue. So imagine if your business could move to 80% service revenue. Not only would you be tens of billions of dollars larger, overall margins could also easily double. And let me remind you this is all being done without connecting your products as IoT devices. If you’re currently the CEO of a power, transportation, construction, agriculture, oil & gas, life science or healthcare machine company, ask yourself this question – how big is your service business?

Business model 2: Product & connected services

If they could connect to their own IoT-enabled products, any business could make the service they provide even more personal and relevant. Many software and hardware product companies already connect to their products to provide assisted services, which help both the manufacturers and users maintain or optimise the security, availability and performance of these products.

Now let’s shift that concept to the world of physical products. If I know both the model number and current configuration of a specific machine as well as the time-series data coming from its hundreds of sensors, then the service I provide can be even more personal and relevant.

I can provide precision assistance for the users who maintain or optimise the performance, availability and security of that machine. If in business model 1 we charge 0.5-1% of the product purchase price per month, then in model 2 I could charge an additional 1-2% for an improved service. At scale, these small margins can be significant. Consider a product that sells 4,000 units at $200K each: at just 1% per month, it could generate roughly $100m a year of high-margin, recurring revenue for the manufacturer.
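As a quick sanity check on those numbers, here is a minimal back-of-the-envelope sketch using the figures above; the unit count, unit price and service rate are simply the illustrative values from the example, not data from any specific manufacturer.

```python
# Back-of-the-envelope check of the connected-service revenue example above.
# Assumptions: 4,000 units sold at $200K each, with a connected-service
# contract priced at 1% of the purchase price per month.

units = 4_000
unit_price = 200_000          # USD per unit
monthly_service_rate = 0.01   # 1% of purchase price per month

installed_base_value = units * unit_price                     # $800M of product in the field
monthly_service_revenue = installed_base_value * monthly_service_rate
annual_service_revenue = monthly_service_revenue * 12

print(f"Installed base value:    ${installed_base_value / 1e6:,.0f}M")
print(f"Monthly service revenue: ${monthly_service_revenue / 1e6:,.0f}M")
print(f"Annual service revenue:  ${annual_service_revenue / 1e6:,.0f}M")  # ~$96M, i.e. roughly $100M
```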

Read the source article at Disruption.

Executive Interview: Richard Soley, CEO of OMG

Industrial Internet Consortium Members “Invent the Future” by Anticipating Disruptions from AI, Pursue Standard Testbeds to Accelerate Adoption

Dr. Richard Soley is Chairman and CEO of the Object Management Group (OMG) and Executive Director of the Industrial Internet Consortium (IIC). In a career in the computer industry approaching 40 years, he has overseen a range of software standard collaboration efforts. These include the CORBA specification, the Unified Modeling Language (UML) and Model Driven Architecture (MDA), which permeate critical software today. He began his professional career at Honeywell Computer Systems working on the Multics operating systems. He was later co-founder and CEO of A.I. Architects, maker of the 386 HummingBoard. Dr. Soley holds bachelor’s, master’s and doctoral degrees in Computer Science and Engineering from MIT. He recently spoke with AI Trends Editor John P. Desmond.

Q. Please describe the mission of the IIC today and give us your historical perspective on AI.

A. The IIC mission continues to focus on industrial applications and market verticals including smart cities, transportation, and agriculture among others. What I find interesting is this sudden recognition that artificial intelligence (AI), machine learning, and even 3D printing, all help in the adoption of IoT and industrial IoT systems.

I was CEO of AI Architects in the mid to late ’80s and worked for Symbolics, which was an AI company making expert system tools and AI hardware. This was around the same time when “Time” magazine announced its 1984 Man of the Year award as the computer, which I believe generated the AI Winter. The magazine created the expectation that machines think. I remember computer companies like Thinking Machines Corporation, whose motto was, “We want to make a computer that’s proud of us,” and that was never the function of AI. Expectations were way over-hyped and couldn’t possibly be met.

The ubiquitous access today to a large number of connected cloud computers that have far more compute power, far more memory, and access to open source software for data analysis, that’s something that we couldn’t even dream of in 1982. It created an opportunity to put more intelligence in systems and bring us to where we are today. It remains overhyped, because that’s the nature of the IT industry, but it will deliver a lot more. Voice recognition experienced this struggle in 1982. And now, we all carry a voice recognition system in our pockets, whether we call it Siri or Google Assistant.

Q. Can you share how IIC is organized and funded?

A. The Industrial Internet Consortium (IIC) is a major part of the Object Management Group, which is itself a nearly 30-year-old standards organization. The IIC is accelerating the adoption of industrial IoT by building testbeds. That requires agreement on a shared architecture, an agreed security framework, an analytics framework, a vocabulary, and so forth.

The testbeds are IIC’s major differentiator. The largest funding source for IIC is membership, with the balance coming mostly from events. Funding of the IIC parallels the funding of OMG.

Q. What are some of the most important IIC initiatives today?

A. The most important initiatives are the testbeds run by our nearly 300 member companies. Roughly 30 testbeds are currently running, and I’ll give examples of a few compelling ones.

The Track and Trace testbed was initiated by Bosch Software Innovations in Germany. It started with a very simple idea: factories could be made more efficient and safer if you knew where everything in the factory was located to within a meter. The three-year-old testbed uses Cisco Wi-Fi routers to triangulate the position of things – people, parts, works-in-progress, and tools – inside the factory, and overhead cameras to provide about five centimeters of resolution. Results from this testbed began to be published last year and are informing requirements for new standards and new concepts of training, retraining, and hiring. It’s a great opportunity to learn more about IoT in manufacturing.
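The testbed’s actual positioning pipeline isn’t described here, but the underlying triangulation idea is straightforward. The sketch below is a minimal, illustrative 2D trilateration from assumed ranges to three Wi-Fi access points at made-up coordinates; it is not Bosch’s or Cisco’s implementation.

```python
import numpy as np

# Illustrative 2D trilateration: estimate a tag's position from its
# estimated distances (e.g. derived from Wi-Fi signal strength) to three
# access points at known locations. Coordinates and ranges are made up.

anchors = np.array([[0.0, 0.0],    # AP1 (x, y) in metres
                    [30.0, 0.0],   # AP2
                    [0.0, 20.0]])  # AP3
ranges = np.array([12.1, 19.4, 15.3])  # estimated distances in metres

# Subtracting the circle equation of AP1 from those of AP2 and AP3
# gives a linear system A @ [x, y] = b.
x1, y1 = anchors[0]
A, b = [], []
for (xi, yi), r1, ri in zip(anchors[1:], [ranges[0]] * 2, ranges[1:]):
    A.append([2 * (xi - x1), 2 * (yi - y1)])
    b.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)

x, y = np.linalg.solve(np.array(A), np.array(b))
print(f"Estimated position: ({x:.1f} m, {y:.1f} m)")
```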

But we’re not limited to manufacturing. Our INFINITE testbed is managed by Dell Technologies in County Cork, Ireland. In over two and a half years of use, it has integrated national and provincial information resources to optimize ambulance response and the information delivered to and from ambulances, helping to save lives.

The Smart Building Testbed is a project between Dell and Toshiba in Yokohama, Japan. The companies outfitted a brand-new building with 35,000 sensors collecting between 0.5 and 1.0 terabytes of data every day. The sensors collect information about light, temperature, people movement, telephone calls, and more. The system is learning how the building is used, which enables optimization for the comfort of the occupants and predictive maintenance of the building.

Major testbed results are being published in the IIC “Journal of Innovation.”

Q. You’ve been involved in several collaborative efforts to define computing standards, including CORBA, Unified Modeling Language and Model-Driven Architecture. How effective has the effort to collaborate on software standards been?

A. Very effective. There are about five billion corporate systems running today, including every smartphone in the world, every telco switch, every banking system, every robotic system, and so forth. Our oldest standard, now more than 28 years old, is the Common Object Request Broker Architecture (CORBA), built into every Java virtual machine.

According to Gartner, 71% of all software development organizations use UML today. Our model-driven approach to building software has been extremely effective.

We’re now in about two dozen vertical markets. And OMG standards drive every retail point of sale, every NATO military radio, and every middleware system. And we have new standards coming out for CubeSats (miniature satellites used for space research). We’ve run our standards process over a thousand times, and all those standards are implemented. Implementation is a requirement of our standards process.

Q. Could you contrast the Industrial IoT opportunity with its Consumer IoT counterpart?

A. First of all, there’s plenty of good work going on in the consumer space and IIC does not need to get involved in it. We see bigger opportunities in the industrial space, where IoT can have huge, disruptive effects on markets including agriculture, healthcare, transportation, smart cities, manufacturing, and production. We’re trying to learn what that disruption will be.

And, in terms of the technology, standards and security are mission-critical in the industrial space. If you hit a switch and nothing goes on or off in your house, you just hit the switch again. That’s not the end of the world. But if the factory stops working for a couple of days, that might be the business equivalent to the end of the world.

Q. How does IIC differentiate the testbeds from a use case?

A. A use case is a use of technology. One of the limitations with standards and with testbeds is a focus on the technology, instead of the application of that technology. Some of the testbeds that we’ve developed at the IIC, such as the Time Sensitive Networking (TSN) Testbed, are necessarily focused on technology availability, integration, and portability.

The use case approach is driven by understanding the desired outcome as opposed to what technology is currently available. For example, the world’s largest copper mining company had a unique need for high-reliability wireless networks to make mining operations safer. As a result, we’re putting together a stack of technology from our members to deliver industry-specific requirements.

Q. Looking ahead, what does the intersection of AI and industrial IoT suggest for innovation?

A. Without question, AI and IoT are going to disrupt enormous existing markets, including transportation, manufacturing, and healthcare. With access to more intelligence, the huge amounts of data generated by industrial IoT can actually be analyzed, generating insights that lead us to better efficiencies, better productivity, and improved safety.

When you have systems that can ingest massive amounts of unstructured data and generate new insights, you are outperforming the human capacity. So, we’re going to see huge disruptions driven by the combination of AI and IoT.

For more information, go to the Industrial Internet Consortium.

Industrial IoT Analytics Moving Into Prime Time

Implementing an Internet of Things (IoT) program isn’t exactly like flipping a switch. There’s a lot involved, from the sensors where data is initially collected, to the network the data travels over, to the analytics systems that figure out what it all means. So while we’ve all been talking about IoT for a few years now, it’s still considered an emerging technology. But that might be about to change.

Forrester Research has predicted that 2018 is the year that IoT will move from experimentation to business scale.

Heeding the call, a few analytics vendors are getting on the bandwagon with formal divisions or product offerings. For instance, in January, SAS announced a new IoT division. And this week Splunk announced its first technology specifically for the IoT market, Splunk Industrial Asset Intelligence. The solution is designed to help organizations in manufacturing, oil and gas, transportation, energy and utilities monitor and analyze industrial IoT data in real time, creating a simple view of complex industrial systems while helping to minimize asset downtime, according to the company’s formal announcement.

As a specialist in machine data analytics, Splunk found IoT a natural extension. IDC Analyst and Program VP Maureen Fleming told InformationWeek in an interview that it is also something Splunk’s customers have been requesting; those customers were already trying to solve some of their IoT challenges using Splunk’s existing offerings. That customer need, along with Splunk’s expertise in machine data analytics, converged to drive Splunk’s IoT launch right now, Fleming said.

Wind, Power, and Data

One of those customers is Australia’s Infigen Energy, which develops, owns, and operates wind farms to power businesses in the country. Information and Application Architect Victor Sanchez told InformationWeek that Splunk IAI has made it easier to troubleshoot issues with Infigen’s legacy automated SCADA control system.

The company first deployed Splunk Enterprise in a pilot in 2014, and since then the system has evolved from simple monitoring to a platform for ingesting data from all the company’s turbines and other equipment. Now Infigen is starting to build its first machine learning models and correlating more data from different levels of the business, from technical to operational, Sanchez said.

Infigen’s implementation uses a translation box deploying Kepware with Splunk’s Industrial Data Forwarder to enable the ingestion of industrial data into Splunk. Sanchez said the company is expanding the size of its on-premises distributed Splunk cluster to future-proof the system and prevent bottlenecks.
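Infigen’s actual ingestion path runs through Kepware and the Industrial Data Forwarder, which isn’t reproduced here. Purely as an illustration of landing a turbine reading in Splunk, the sketch below posts one event to Splunk’s HTTP Event Collector (HEC); the host, token, index and field names are placeholders and assume HEC has been enabled.

```python
import time
import requests

# Illustrative only: push one turbine sensor reading into Splunk via the
# HTTP Event Collector (HEC). Host, token, index and field names are
# placeholders; this is not Infigen's Kepware-based pipeline.

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # hypothetical token

reading = {
    "turbine_id": "WTG-042",
    "rotor_speed_rpm": 14.2,
    "gearbox_temp_c": 61.5,
    "power_kw": 1850.0,
}

payload = {
    "time": time.time(),
    "host": "scada-gateway-01",
    "sourcetype": "turbine:telemetry",
    "index": "iiot",
    "event": reading,
}

resp = requests.post(
    HEC_URL,
    json=payload,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # HEC responds with e.g. {"text": "Success", "code": 0}
```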

The system has given all employees better visibility into this key data. The company also uses the Splunk mobile app to enable alerts and data access on the go.

Seema Haji joined Splunk about 10 months ago to help the company launch the technology as director of product marketing for IoT and business analytics. She told InformationWeek in an interview that many of Splunk’s existing customers have been looking for a way to work with their IoT data. Many had been using Excel to bring data in and analyze it. Splunk has set these customers up with a limited availability version of the new IoT analytics offering.

Read the source article in InformationWeek.com.

How Blockchain Technology and Cognitive Computing Work Together

When it comes to revolutionary technology, the blockchain and cognitive computing are two at the top of the list in 2018. With these technologies finally being put to use in practical applications, we’re learning more and more about what they can do on their own—and together. Let’s take a look at how some industries can take advantage of this powerful combination.

Before we can discuss what these two technologies can accomplish together, it’s important to understand them separately.

Cognitive computing is essentially using advanced artificial intelligence systems to create a “thinking” computer. Deep learning allows cognitive computers to learn and adapt as they receive new data, and they do not simply execute logic-based commands as computers have traditionally done. Because the technology is evolving and encompasses many different AI systems, there is no standardized definition for these systems. However, the term is best used to describe computer systems that mimic the human brain.

The blockchain, a new system for storing information and processing transactions, was created for the distribution of bitcoin, the world’s leading cryptocurrency. It’s different from most databases because it uses a distributed ledger system, rather than a centralized database. In basic terms, that means the information is distributed across thousands of computers on a network, instead of being stored in one location. The information is updated regularly, and everyone on the network can view it.

This makes the blockchain more secure than a traditional database since a hacker cannot compromise the whole system by breaching one computer. Today, the blockchain is becoming popular in some industries for its superior security. Cybersecurity is a growing concern, and the blockchain could be one way some industries can reduce the number of breaches.
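The tamper evidence that underlies this security can be illustrated in a few lines. The sketch below is a minimal, single-node hash chain, with no consensus, peers or mining; it only shows why altering one stored record breaks every hash that follows.

```python
import hashlib
import json
import time

# Minimal hash-chained ledger: each block commits to the previous block's
# hash, so altering any earlier entry invalidates every later hash.
# Teaching sketch only -- no consensus, networking or mining.

def block_hash(block: dict) -> str:
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def add_block(chain: list, data: dict) -> None:
    prev = chain[-1] if chain else None
    chain.append({
        "index": len(chain),
        "timestamp": time.time(),
        "data": data,
        "prev_hash": block_hash(prev) if prev else "0" * 64,
    })

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
add_block(ledger, {"device": "pump-7", "reading": 42.1})
add_block(ledger, {"device": "pump-7", "reading": 43.0})
print(is_valid(ledger))              # True

ledger[0]["data"]["reading"] = 99.9  # tamper with history
print(is_valid(ledger))              # False -- the chain detects the change
```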

Cognitive Computing and Blockchain – the “IoT Dream”

So how do these two technologies work together? Since we’ve only just scratched the surface on the capabilities of both the blockchain and cognitive computing, there’s still a lot of opportunity for bringing these technologies together. One of the largest areas for potential expansion is in tandem with IoT (Internet of Things) growth.

Many industries are beginning to see how using interconnected devices can help them automate and improve their processes, but there are currently limitations on scaling and security with centralized systems.

IBM, a leader in artificial intelligence, has already integrated its Watson supercomputer into a platform for IoT, allowing businesses to make better use of the data they collect using these devices. IoT devices collect the data, but the majority of this data is “dark”, meaning that it just sits in storage and isn’t used for anything. Cognitive computing has the ability to process this data in ways humans can’t—while gaining valuable insights that can be used in strategic planning and performance measurement.

So how does the blockchain fit into this equation? Mainly, as a way to scale IoT usage and for security purposes. IoT data can be extremely sensitive and valuable to businesses—the last thing a company wants is for a data breach to occur. Blockchain ledgers also create logs for context, which provide detailed information about anomalies and problems and break down exactly where and when these problems occurred.

  • By Sarah Daren, consultant

Read the source article at RTInsights.com.

How to Begin Integrating AI into Data Center Operations

Rich Rogers, a senior vice president of product and engineering at Hitachi Vantara, envisions a data center in which AI-driven management software (some or all of it cloud-based) will monitor and control IT and facilities infrastructure, as well as applications, seamlessly and completely across single or multiple sites. Compute, power, storage, networking and cooling operations will flex dynamically to achieve maximum efficiency, productivity and availability. Human operators, meanwhile, will be free to do what they do best: plan new capabilities and innovate improvements.

“IoT and AI will enable data center issues to be root-caused and resolved automatically by software,” Rogers said. Data center administrators will no longer be woken up at night to troubleshoot outages. “Voice technologies will enable data center operators to monitor and manage their data centers from any location, be [they] at the grocery store, gym or living room couch,” he predicted. IT infrastructure gear will be deployed and maintained autonomously. “You simply stock new compute nodes and disk drives and robotics [will] streamline the technology to the appropriate systems,” Rogers explained.

AI-driven automation’s long-term goal is to drive IT managed services toward zero downtime. “Over time we expect the traditional SLA model—99.xx availability, etc.—will have no meaning as the system is always on, compliant, secure, agile and flexible,” advised Satheesh Kumar, IBM’s vice president of hybrid services, AI platform.

Data center infrastructure management is currently highly reactive, because disruptions and delays arrive unexpectedly. AI aims to fix this. “As infrastructure becomes increasingly vital and complex, this resource-intensive approach won’t work,” observed Milan Shetti, general manager of Hewlett Packard Enterprise’s storage division. “It’s no longer acceptable to find out about a disruption after it has occurred or spend the resources to resolve them—that’s the opportunity for AI.”

A rapidly growing number of smart sensors are becoming available to capture data from various data center elements, relaying critical insights into mechanical, electrical and environmental conditions.

“This data can be then used by sophisticated algorithms to analyze any potential problems or anomalies in the whole system, and warn data center managers well in advance,” noted Param Vir Singh, associate professor of business technologies at Carnegie Mellon University’s Tepper School of Business.
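The article doesn’t say which algorithms these products use. As a minimal illustration of the idea, the sketch below flags unusual readings from a single simulated data center sensor with a rolling z-score; the window size and threshold are arbitrary choices, not vendor recommendations.

```python
import numpy as np

# Illustrative rolling z-score anomaly check on one sensor stream,
# e.g. an inlet-temperature reading sampled once a minute. The window
# size and threshold are arbitrary assumptions.

rng = np.random.default_rng(0)
temps = 22 + 0.3 * rng.standard_normal(500)  # normal operation (deg C)
temps[420:] += 2.5                            # simulated cooling fault

WINDOW, THRESHOLD = 60, 4.0
for t in range(WINDOW, len(temps)):
    history = temps[t - WINDOW:t]
    z = (temps[t] - history.mean()) / (history.std() + 1e-9)
    if abs(z) > THRESHOLD:
        print(f"minute {t}: reading {temps[t]:.2f} C looks anomalous (z={z:.1f})")
        break  # in practice this would raise an alert, not stop the loop
```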

Read the source article at InformationWeek.com.

San Jose Relying on Planning, Collaboration to be Smartest City

The San Jose Smart City Vision is a plan that uses technology and data-driven decision-making to promote safety, sustainability, economic opportunity and quality of life for its constituents. The California city’s endgame is to become the most innovative city in America by 2020.

However, to get there, the city needs some internal planning — and a little help from its friends.

Last year, San Jose established a new Office of Civic Innovation to implement its vision to become as safe, inclusive, user-friendly and sustainable as possible, as well as to demonstrate the possibilities of technology and innovation.

The office will oversee a number of projects, programs and opportunities related to the city’s goal of becoming more efficient and effective, such as public safety, demonstration projects, data analytics, sustainability and public-private collaborations.

San Jose smart city projects are being narrowed down by focusing on three questions:

  1. Is the problem causing a lot of people pain and annoyance?
  2. Is it something that is core to what the city should do?
  3. Is the problem amenable to solution at scale with either technology or process improvement?

“If the answers are yes, yes, yes, then the problem is something we want to address in our innovation portfolio,” said Kip Harkness, deputy city manager for Civic Innovation. “One of the projects at the top of the list is hiring. If we’re going to be a smart city — actually, we like to think of it as a ‘learning city’ — that is going to be powered by the people who work for us.”

Earlier this year, the John S. and James L. Knight Foundation awarded the city of San Jose $200,000 in funding to explore how to develop and implement smart technology “in responsible and equitable ways.” The award was part of a $1.2 million commitment from the Knight Foundation to help San Jose and other cities, including Akron, Ohio; Boston; Detroit; Miami; and Philadelphia, explore IoT applications in their respective cities.

For San Jose, the smart city funding will be used to support IoT strategic planning for better IoT investments, IoT infrastructure financing, regulation of smart technology assets, and the creation of private-sector partnerships that benefit citizens.

Read the source article at TechTarget.

DATA – Blue Ocean Shift Strategy (BOSS)

BOSS – Blue Ocean Shift Strategy – can help create a vision focused on areas such as AI and blockchain for education, health and agriculture, and can help build ecosystems using big data analytics and IoT. To capture a quick snapshot of this strategy: Big Data certainly appears to be the most effective and efficient driver for a Blue Ocean Strategy. Based on a limited set....

Flood Sensors Wade into Artificial Intelligence Across Iowa

Worried about the possibility of flooding near your home in Iowa? Soon, you can just ask Alexa.

The move to merge flood sensor data with artificially intelligent chatbots marks the next generation of flood data analysis available all across Iowa. The project is being led by the Iowa Flood Center, based at the University of Iowa, which has a long history of studying the effects of rainfall and flooding in the state.

“Next-generation IFIS, [Iowa Flood Information System] Flood AI — an artificial intelligence system — will be launched in March with Siri-like capabilities on many communication platforms,” said Ibrahim Demir, a professor of civil and environmental engineering at the University of Iowa and the architect behind the Web-based flood alert and analysis system.

The Iowa Flood Center was formed by the state following “a major flood event” in 2008, primarily in eastern Iowa. The center deployed some 250 water level sensors, attached mostly to bridges, said Nathan Young, associate director of the Iowa Flood Center.

“There are a few kind of concentrated areas where we have some scientific questions we’re trying to answer. But generally, we try to deploy them in areas that are going to benefit communities, to help them better anticipate the severity of the flood, as it’s happening,” Young added.

The flood center receives about $1.2 million in annual funding from the state. The data collected by the sensors complements the existing water data collected by the U.S. Geological Survey.

“We’re providing a little bit less information, but we’re able to provide more sensors, and distribute them more broadly, to provide information to small communities as well as large communities in the state,” said Young.

Getting flood and rainfall data into the hands of local elected officials, emergency planners and average citizens is the flood center’s primary goal, which is why officials stress that it should be as user-friendly as possible.

“Really, the general public is our target audience, so we try to minimize the technical detail, and the technical jargon and try to make it usable for everyone,” said Young.

Read the source article at FutureStructure.

From 2017 AI World: Real-Time IIoT in Action at Smart Community Center; AI driving chip architecture at the edge

A smart building in Kawasaki, Japan, called the Smart Community Center has 35,000 sensor devices in it, making it possibly the premier example today of the Industrial Internet of Things (IIoT) in action. Announced in late 2016, the partnership of Dell EMC and Toshiba is developing a testbed to make sense of the data from the many sensors. The Industrial Internet Consortium (IIC), a membership program dedicated to accelerating the Industrial Internet of Things (IIoT), approved the testbed, the first deep learning platform it has approved.

Smart buildings aim to lower the cost of maintenance and operation, and keep tenants happier with fine-tuned heating and lighting for instance. The Smart Community Center generates 300 million data points per day, said Richard Soley, chairman and CEO of the IIC and the Object Management Group, in an interview with AI Trends. “Working with that much data is a big deal,” he said.

The system could potentially predict the failure of a key component before it happens, based on the maintenance history now available to it. Replacing a weak component before it fails lessens disruption and holds down overall costs.

Dell EMC is putting substantial effort into the deep learning testbed for use in the Smart Community Center. “The testbed is an enabler for the industry,” according to Said Tabet, Technology Lead, IoT Strategy for Dell EMC. “The test beds allow for better understanding end-to-end, enabling better business models and use cases.”

Sensors in the Smart Community Center are clustered in areas related to maintenance and energy consumption, including the heating and cooling systems. “Our experience is in learning from big data,” Tabet said. “Many systems are not yet ready to handle big data. So they are learning.” The real-time IIoT testbed system under development leverages deep learning for that.

Soley worked for Symbolics, which sold Lisp processors for $100,000 each in 1981; that was the price of getting access to AI. Now racks of 200 processors have 100,000 times the processing power of that Symbolics hardware and cost a lot less. GPU chips have helped enable this boost in the processing power needed to power AI.

So what is the gating factor today with this incredible increase in power? “Parallelizing the algorithms is the gating factor today,” Tabet said. Data collection may be happening at the edge, while the inference engine is running somewhere else, resulting in latency. Real-time systems may not have time to send information to the cloud and wait for it to come back, before an action is required.

Finding qualified workers who can combine knowledge of deep learning and machine learning is another challenge. “There is not enough expertise out there right now,” Soley said. Dell EMC is doing in-house training for education and innovation in AI, Tabet said.

AI, Machine Learning and Real-Time IoT

The latency issue was also cited by Michael Alperin, an industry consultant with the data science team at Tibco, which provides analytics and event-processing software, during a panel at AI World on AI, Machine Learning and Real-Time IoT.

“In practice, real-time means insights derived from data are needed at the moment they are most useful. The exact requirement depends on the use and the data update frequency,” Alperin said. For maintenance, equipment sensor data can be combined with histories of when a machine has failed in the past. “Then you can intervene before the machine goes down,” in theory, he said.

The goal in many factories is to pull all the sensor data together to get a coherent big picture. Companies seek “the ability to take all of the data being generated in a factory and predict the final product quality. That’s what we see people doing today with supervised machine learning,” Alperin said.
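None of the vendors’ models are described in detail here. As a hedged sketch of the supervised-learning pattern Alperin describes, the example below trains a classifier on synthetic per-batch sensor aggregates labelled with pass/fail outcomes; the features, labels and the choice of scikit-learn are illustrative assumptions, not any vendor’s implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Illustrative supervised-learning setup: predict final product quality
# (pass/fail) from per-batch sensor aggregates. Data is synthetic and the
# model choice is an example, not what any vendor quoted here ships.

rng = np.random.default_rng(42)
n = 2_000
X = np.column_stack([
    rng.normal(180, 5, n),    # oven temperature (deg C)
    rng.normal(2.0, 0.2, n),  # line speed (m/s)
    rng.normal(55, 8, n),     # vibration RMS
])
# Synthetic ground truth: hot ovens plus high vibration tend to fail.
fail_score = 0.3 * (X[:, 0] - 180) + 0.1 * (X[:, 2] - 55) + rng.normal(0, 1, n)
y = (fail_score > 2.0).astype(int)  # 1 = failed quality check

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```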

David Maher, EVP and CTO of Intertrust Technologies Corp., a software technology company specializing in trusted distributed computing, is helping to process signals from offshore wind farms. They use predictive modeling to help in maintenance and to manage power distribution. “Most past models are obsolete; we need AI to help match supply and demand today,” Maher said. “It’s very sophisticated. We have solar, geothermal and wind power all combined.”

AI Chip Architecture at the Edge

The drive to put compute power at the edge is placing a burden on smaller processors, which is driving evolution in chip design. Much of today’s AI happens in the cloud; however, “Edge computing is changing to put the AI right in the processor on the edge,” said Dr. Shriram Ramanathan, senior analyst at Lux Research and moderator of an AI World panel on the Evolution of AI Chip Architecture at the Edge.

Semiconductor Energy Laboratory Co., Ltd. (SEL) of Japan is in an interesting position, having designed a chip that consumes less power and generates less heat, which the company says makes it well-positioned for edge computing. “Our company deals with material science and enabling low-power devices,” said Shinji Hayakawa, in technical services with SEL. “The raw volume of data used for AI is enormous. As we send more processing to the edge, we think more people will need edge computing capability.”

Oskar Mencer is the founder of Maxeler Technologies, which offers a dataflow approach to processing that is said to result in dramatic performance improvements. Mencer said he founded the company to give the industry an alternative to microprocessors. “With AI, we have an opportunity,” he said. “We have new chip architectures, and we will probably have to change all of computer science” to implement properly on them, he suggested.

Jeff Burns, director of systems architecture and design for IBM Research, said IBM has an emphasis on AI going forward. “When we talk about AI, getting more function into smaller form factors is a clear long-term trend,” Burns said.

The latency issues are driving innovations in edge computing, suggested Dinaker Munagala, CEO and Founder of ThinCI, a company working on deep learning and vision processing. “It’s not possible to get all the data we need out to the cloud,” he said. “Latency and bandwidth are issues in real-time systems.”

Mencer of Maxeler said, “The cloud is a great prototyping environment. We can change the software running on the cloud every hour if we want to. As we stabilize what the device needs to do, it makes no sense to send it to the cloud. It makes sense to do the processing locally.”

New chip designs “will help us push the computing industry forward,” Mencer suggested. The more data being generated, the greater the case for edge computing. “We will see more technologies deployed on the edge, which will be great for innovation,” he said.

Dr. Ramanathan asked the panel which analytics are appropriate to run at the edge, and which in the cloud.

Mencer said, “It’s not hard.” A high volume of data comes from the sensors, so a data reduction step is needed: you need to go from 1 Tbyte to something you can send to the cloud, based on what the purpose is. “It’s about figuring out the use cases,” he said.
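As a rough sketch of that reduction step, the example below collapses a simulated high-rate vibration stream into per-second summary statistics before anything leaves the edge device; the sampling rate and chosen statistics are assumptions, not anything the panelists specified.

```python
import numpy as np

# Illustrative edge-side data reduction: collapse a 10 kHz vibration
# stream into per-second summary features before uploading. The sampling
# rate and chosen statistics are assumptions, not a recommendation.

SAMPLE_RATE = 10_000                          # samples per second at the sensor
rng = np.random.default_rng(1)
raw = rng.standard_normal(SAMPLE_RATE * 60)   # one minute of raw samples

windows = raw.reshape(-1, SAMPLE_RATE)        # one row per second
summary = np.column_stack([
    windows.mean(axis=1),
    windows.std(axis=1),
    np.abs(windows).max(axis=1),              # peak amplitude
    np.sqrt((windows ** 2).mean(axis=1)),     # RMS
])

print(f"raw samples: {raw.size:,} values")
print(f"uploaded:    {summary.size:,} values "
      f"({raw.size / summary.size:.0f}x reduction)")
```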

Munagala of ThinCI suggested “purpose-built hardware at the edge” will be a fit for certain AI applications. Hayakawa of SEL said, “The data processing and memory need to come together for AI processing; the current model, where data and processing are divided, might not be sustainable.”

Mencer said software written over the last 50 years was written with little regard for hardware efficiency. “But chips are cool again, as was said here. Making your own hardware is acceptable now.”

  • By John P. Desmond, from the 2017 AI World Conference in Boston