This is the next blog in the continuing series of interviews with leading professionals.
In this blog series, we continue our talk with Professor Hideyuki Nakashima, President of Future University – Hakodate and an internationally renowned computer scientist, inspirational visionary and top-ranked leader. To find out more about Dr. Nakashima, see the first blog in this series.
I had the pleasure of meeting Dr. Nakashima at an invitation-only international summit of large societies, where CIPS participated and presented. Hideyuki’s reputation for innovation and leadership is well known worldwide, and it led to this blog series. Owing to his significant and outstanding achievements, Dr. Nakashima was the youngest professor ever to be offered the presidency of a university in the history of Japan. He is creating new branches of computer science, and merging others, that will shape the world well into the future. It’s worth your time to follow his work. There is an element of “Star Trek” to it that I find compelling, and in talking with Hideyuki, I can feel his passion. I wanted to share this with you too!
Thank you and Enjoy!
Stephen: What kinds of problems will we be solving in 2020? When will raw processing surpass our human abilities?
Hidey: 2020, more than a decade from now, cannot be predicted with precision. In any case, the future is not something to predict but something to design. The goal of the Cyber Assist Project was to design such a future.
Stephen: Describe your work as Director of the Cyber Assist Research Center. Which areas of your work are you most proud of?
Hidey: I proposed the Cyber Assist Research Center to AIST, and it was accepted. The Cyber Assist project, which was carried out at the center, was one of the first projects on ubiquitous computing in Japan. Here is a description of the project, quoted from one of my papers:
Current information processing tools such as personal computers and the Internet are not always easy to use. Novice users often have to take a class to master them. A new trend in research was recently initiated to turn the tables. MIT’s Oxygen project, Microsoft’s Easy Living and Hewlett-Packard’s Cool Town, as well as our Cyber Assist project, are among those announcing new directions: making IP machines invisible to human users while still providing a rich, ubiquitous supporting environment.
The goal of the Cyber Assist project is to develop human-centered IP assistance systems (intelligence boosters) usable without special knowledge or training. We also address the problems of information overload and privacy.
Our target is to propose a plan for future cities with information feedback control systems, achieved through sensors, actuators and the information processing over them. Therefore, our use of “cyber” differs from that of the mass media, where the word is synonymous with “digital”.
In fact, we define
cyber = digital + real
meaning that the cyber world implies grounding digital, or logical, information in the real, or physical, world.
We have two main targets:
- Situated information support, and
- Privacy protection.
These targets often contradict each other. You have a better chance of getting personally tailored services when you submit more personal information, such as your preferences and previous experience. The key issue is that the users receiving the services, not the service provider, must control their personal information. In this respect, we believe it is important that the communication itself be anonymous, and that personal identity, if necessary, be given as part of the content of the communication.
To achieve the above goals, we have two main approaches:
- Intelligent content, and
- Location-based communication.
The communication method and the content are closely related to each other and should not be designed separately. We believe one of the important groundings of content is to its location (of existence, use, etc.).
One of the products of the project is a battery-less information terminal called Aimulet, which was actually used in the Global House (the Japanese government pavilion) and Laurie Anderson’s “Show and Walk” at Aichi Expo 2005.
Aimulet LA received a Good Design Award (G-Mark) from the Ministry of Economy, Trade and Industry this year.
Stephen: “The goal of the Cyber Assist project is to develop human-centered IP assistance systems (intelligence boosters) usable without special knowledge or training.” Can you describe other examples of this concept in action?
Hidey: Actually, it is technically harder to design such systems than to design sophisticated devices with lots of control adjustments. The best example of a system usable without special knowledge of the system is a car. To drive a car, you do not need any knowledge of its mechanism. ABS and sophisticated engine control are achieved without the user noticing. However, when the designer of a car tries to give some control to the user, the landscape changes. The new BMW 7 Series is computerized, and its manual runs over 500 pages. Thus an information processing version of a basic car is hard to find these days. The evolution of mobile phones seems to be going in two directions: (1) harder to use, with many sophisticated mechanisms, and (2) easy to use, with simple fixed functions.
Intelligence boosters are much, much harder to find. We will have to wait until semantic computing becomes widely available.
Stephen: “We also address the problems of information overload and privacy.” In what ways do you do this?
Hidey: Semantic computing is half of the solution. An intelligent search engine is a key to coping with information overload. The rest of the solution is ID-free communication. As long as you use a global ID to specify the address of communication, your privacy is vulnerable. We proposed “location-oriented communication”, in which the target address is specified using the physical location of the user rather than their ID. Any person at that location can get the service without giving up their privacy.
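As a rough illustration of this idea (a toy sketch, not the project’s actual protocol — all names here are hypothetical), location-oriented communication can be thought of as addressing a service by a spatial cell instead of by any user identity:

```python
# Hypothetical sketch of location-oriented communication:
# services are addressed by a spatial cell, not by a user ID,
# so no personal identifier is ever transmitted.

def cell_of(lat: float, lon: float, grid: float = 0.001) -> tuple:
    """Quantize coordinates into a grid cell that acts as the address."""
    return (round(lat / grid), round(lon / grid))

# Services are registered per cell; no registry of users exists anywhere.
services = {
    cell_of(35.0116, 135.7681): "station area guide",
}

def request_service(lat: float, lon: float) -> str:
    """Anyone at this location gets the service; no ID is given up."""
    return services.get(cell_of(lat, lon), "no service here")
```

The point of the sketch is that the lookup key is derived entirely from where the requester stands, so the provider learns nothing about who asked.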
Stephen: “Our target is to propose a plan for future cities with information feedback control systems, achieved through sensors, actuators and the information processing over them.” What pending projects take this further? Can you give an overview of them?
Hidey: There are new research and development fields variously called “ubiquitous computing”, “pervasive computing” and “ambient intelligence”. Ambient intelligence is an EU initiative. R&D on sensor networks to identify the context of the service recipient is also active.
Stephen: Where do you see this heading into the future: societal applications of information processing technologies?
Hidey: As we all see, the Internet is changing some of our lifestyles and business styles. In addition, the maturation of ubiquitous computing technology, particularly advances in positioning and telecommunication systems, allows us to design advanced assistance systems for many aspects of our everyday lives.
We can not only increase the efficiency of current systems; we can also design whole new systems that were impossible to deploy without IT.
A mass user support system would have a large impact on how our societal infrastructure is designed and operated. A new societal design concept would benefit not only society as a whole but also individuals. However, societal systems are hard to design and harder to test in the real world. To design a mass user support system, multiagent simulation is very effective. I gave an example of such a simulation for new transportation systems such as bus-on-call. Rescue simulation is another good example. Large-scale disasters are so rare that we need simulators to plan and test good rescue strategies. Such a simulation was put to the test using Kyoto Station in the Digital City Kyoto project. Simulators are also effective for training rescue personnel.
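To make the bus-on-call example concrete, here is a minimal, hypothetical sketch of such a multiagent simulation: a single bus agent greedily serves ride requests on a one-dimensional street, and the distance travelled is the quantity the simulation lets us study. The real simulations were of course far richer.

```python
# Hypothetical multiagent sketch: one bus agent serving on-call requests
# placed along a one-dimensional street. The simulation measures total
# distance travelled under a "drive to the nearest request" policy.

def simulate_bus(start: int, requests: list) -> int:
    """Greedy bus agent: always drive to the nearest outstanding request."""
    pos, travelled, pending = start, 0, set(requests)
    while pending:
        nxt = min(pending, key=lambda r: abs(r - pos))  # nearest request
        travelled += abs(nxt - pos)
        pos = nxt
        pending.remove(nxt)
    return travelled

# Comparing policies is the point of such simulations: a fixed-route bus
# would traverse the whole street regardless of where the demand is.
```

Running many randomized demand scenarios through a simulator like this, before building anything, is exactly the kind of cheap real-world test substitute described above.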
Let me take this opportunity to explain my definitions of the difference between two kinds of IT, namely ICT and IPT. The Internet is an example of information communication technology (ICT), where computing power is used to transfer information from one place to another. Although data manipulations such as search and format transformation (typically encryption and decryption, compression and expansion) are performed, no essential data manipulation (such as image or language understanding) is involved in the process. It is humans who create and consume the information.
In information processing technology (IPT), on the other hand, it is the computer that creates, manipulates and understands information. Data mining is one example: computers discover new information that was hidden from human eyes. IPT is the core technology supporting so-called “Web 2.0”, while ICT is the core for Web 1.0.
ICT enhances human activities. The Internet gives people access to worldwide information resources from anywhere in the world. The technology has changed some of our economic systems; online shopping is the best-known example. However, it is still necessary to store and transport goods in the traditional way. ICT is important, but it has its own limitations unless we change the underlying societal systems. With IPT, we can go further. A free flight system using GPS positioning is one example: instead of traditional air traffic control, each aircraft can fly autonomously with the help of a multiagent collision avoidance system. Mercedes-Benz and other companies are conducting research on future transportation and delivery control systems.
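The data mining example of IPT can be sketched in a few lines. This is a deliberately tiny, hypothetical illustration: the computer scans raw transaction data and surfaces a pattern (the item pair most often bought together) that no human explicitly wrote down.

```python
# Hypothetical IPT sketch: the computer "discovers" information hidden in
# raw transaction data -- here, the pair of items most often bought together.

from collections import Counter
from itertools import combinations

def frequent_pair(transactions: list) -> tuple:
    """Count co-occurring item pairs and return the most frequent one."""
    counts = Counter()
    for basket in transactions:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return counts.most_common(1)[0][0]

baskets = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
    {"bread", "milk"},
]
```

In ICT terms the baskets would merely be stored and transmitted; the IPT step is the counting and ranking that produces information no one entered by hand.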
Stephen: “Computers discover new information that was hidden from human eyes. IPT is the core technology supporting so-called ‘Web 2.0’, while ICT is the core for Web 1.0.” Hidey, how would you describe the elements that make up Web 2.0 and where it is going? There seems to be an increasing trend towards online social communities and user-driven sorting and tagging.
Hidey: Web 2.0 is hard to define. It is a combination of many programs that provide various kinds of user interfaces. However, one common aspect is the use of huge computing resources (both CPU and memory, with more emphasis on CPU power). In my opinion, blogs and video/photo sharing are realizable with the old technologies of Web 1.0: although they may need vast storage, they do not need much CPU power. Wikis may be a bit different. The current wiki relies on human editors, but in the near future computers may take over. They could automatically gather information from the Web, filter out unreliable information, and prepare wiki pages for humans to read. Of course, the initial tags may still have to be given by humans, but part of that may also be automated. For example, when you take a picture, GPS information about location and orientation may automatically be added to the picture. Even semantic inference about the subject and situation, using other sensors, may become possible.
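The automatic-tagging idea above can be sketched as metadata attached at capture time. This toy example uses hypothetical names (not a real camera API) to show location and orientation tags being added to a photo with no human input:

```python
# Hypothetical sketch: attach GPS-derived tags to a photo at capture time,
# so the initial tags need no human editor at all.

def capture_photo(pixels: bytes, gps_fix: dict) -> dict:
    """Bundle the image with automatically generated location tags."""
    return {
        "pixels": pixels,
        "tags": {
            "lat": gps_fix["lat"],
            "lon": gps_fix["lon"],
            "heading_deg": gps_fix["heading"],  # camera orientation
        },
    }

photo = capture_photo(b"...", {"lat": 41.77, "lon": 140.73, "heading": 90})
```

Real cameras do something very similar by writing GPS fields into the image’s EXIF metadata; the semantic inference Hidey mentions would then build further tags on top of these.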
In the next blog, Hidey will talk about:
- Cyber Assist applications;
- Small payment systems;
- Information abuse;
- Information Societies;
- Practitioner involvement;
- IT curriculum changes.
I also encourage you to share your thoughts on these interviews here, or send me an e-mail at email@example.com.