August, and the morning is still very warm. An interesting subject has brought us to one of the offices of the SkyWay design organization for a conversation with Yuri Sorokin, PhD in Engineering Science, a specialist in artificial intelligence and automated control systems.
It is the beginning of the working day, and the building is filling with a busy hum. We pass through the corridors to a spacious office on the 9th floor, which opens onto a beautiful view of the National Library.
We came to Yuri to talk about the safety of SkyWay transport complexes. Along the way, we also learned how the artificial intelligence used in SkyWay differs from that of unmanned vehicles, how unibuses communicate with each other, and which movie is worth watching if you want to understand the development of artificial intelligence.
On the need for intelligent systems
Artificial intelligence (AI) is heavily idealized, largely thanks to marketing. In reality, AI development is still at an early stage. When people first set out to teach a machine to analyze information and make decisions, they began by programming fixed algorithms into it. The result depended on the developers' ability to foresee every situation, which is obviously impossible. Hence the need to teach the computer to reason and to adapt. Professionals all over the world are now working on this, but the computer still cannot do without a human.
In what way is artificial intelligence like a child?
Somewhere in an ideal universe, children can learn from their own mistakes: they explore the world themselves, and the cost of their mistakes is vanishingly small. But that is a perfect universe. In the real world, you would not want your son to stick his fingers in a socket to find out that electricity hurts.
The same is true of artificial intelligence. Take a machine and a labyrinth: a small car is sent in to look for the exit. It drives into a dead end, remembers there is no exit there, and turns back. The cost of such an error is small, and so is the risk.
But suppose an obstacle appears in front of the car while it is moving: say, a truck standing on the roadway, with too little braking distance left. A car tuned to strict compliance with traffic rules faces a dilemma: cross the solid line into the free oncoming lane, or brake and allow a collision with possible consequences for the passengers' health. A self-learning car might choose not to violate the rules, and then, after evaluating the consequences, conclude that the decision was wrong. But you would hardly like to become a guinea pig and train a car at the expense of your own health. Self-learning AI is therefore acceptable only where the consequences of a wrong decision are small.
Accordingly, the role of humans in building AI systems is undeniably large. It is people who must train the system initially, and the quality of that training depends directly on the skill of the developers and programmers who teach it. The machine must remain under control at all times.
How do we teach SkyWay transport?
Take, for example, the facial recognition system installed in the SkyWay transport complex. Before it could learn to detect deteriorating health, acts of vandalism or violence, or unwanted passengers, it first had to learn to recognize faces at all. It was shown a huge number of images of people and situations, including unlawful ones. In this way the system identified patterns, so that it could eventually identify people for access control to the services of the transport complex and detect situations such as forgotten belongings or fights.
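As a rough illustration (not SkyWay's actual implementation), an access-control check of this kind often reduces to comparing a face embedding against a gallery of registered passengers; the embedding model itself is trained elsewhere on exactly the kind of labeled image collection described above. All names and numbers here are made up:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(embedding, gallery, threshold=0.8):
    """Return the best-matching known identity, or None (deny access)."""
    best_id, best_score = None, threshold
    for person_id, ref in gallery.items():
        score = cosine_similarity(embedding, ref)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical gallery of registered passengers (embeddings are invented).
gallery = {"passenger_17": [0.9, 0.1, 0.4], "passenger_42": [0.1, 0.8, 0.6]}
print(identify([0.88, 0.12, 0.41], gallery))  # matches passenger_17
```

The threshold is the safety dial: raising it trades more false rejections for fewer false admissions, which is the same error-cost reasoning discussed earlier.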
String transport is trained continuously. Information for analysis is recorded during every run, regardless of whether the AI or the suspension system is being tested, so the dataset is constantly updated.
Manufacturers of driverless cars also constantly collect information for AI training as their vehicles cruise the streets. In their case, however, it is needed to teach the car to react to every danger encountered on the road and to protect it from outside threats. In SkyWay transport, the AI has more humane goals: by raising the track to the second level, we have eliminated external threats. The AI therefore learns to monitor safety inside the vehicle and to calculate the fastest and most convenient route tasks. When unibuses go into mass production and appear on city streets, the AI will continue to learn, but it will never be a threat to human safety.
Comparing string transport and driverless cars
Both driverless cars and SkyWay transport systems use several types of sensors: optical (cameras) and radar. What are their advantages and disadvantages?
Optical sensors paired with well-trained AI recognize objects easily, though they are still far from human level: a camera sees nothing more than a set of pixels, even if it can pick a specific object from a given list out of a crowd faster than a person. But a camera measures distance and speed poorly. Another drawback is that it is very sensitive to external conditions (light, rain, fog). There are ways to deal with these, but they significantly increase the cost of the surveillance system.
The second sensor widely used for object detection, both in string transport and in most driverless cars, is radar. It is considered all-weather and measures the distance to an object and its motion parameters excellently. But because it yields so few classifying features, it can hardly identify the objects it detects and perceives everything as an obstacle. During tests at the test site we see reflections from everything: grass, supports, infrastructure. If the vehicle treated all of them as obstacles, it would simply stand still.
To improve the information received from cameras and radars, we have created software that combines data from the optical and the location sensors. You could see it in action during the demonstration of Anatoly Yunitsky's "doggie". We have taught the AI to supplement one source of information with the other, so as to make the best decisions based on the combined data and ensure the safe movement of the vehicles.
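A minimal sketch of this kind of fusion (SkyWay's actual software is far more elaborate): camera detections carry a class label and a bearing, radar returns carry a bearing, range, and radial speed. Pairing them by nearest bearing yields classified obstacles with measured motion parameters, while unmatched radar reflections (grass, supports) are simply dropped. The data below is invented for illustration:

```python
def fuse(camera_detections, radar_detections, max_bearing_gap=2.0):
    """Pair each camera detection with the closest radar return by bearing.

    camera_detections: list of (label, bearing_deg)
    radar_detections:  list of (bearing_deg, range_m, speed_mps)
    Returns fused obstacles as (label, range_m, speed_mps).
    """
    fused = []
    for label, cam_bearing in camera_detections:
        candidates = [r for r in radar_detections
                      if abs(r[0] - cam_bearing) <= max_bearing_gap]
        if candidates:
            bearing, rng, speed = min(candidates,
                                      key=lambda r: abs(r[0] - cam_bearing))
            fused.append((label, rng, speed))
    return fused

cams = [("pedestrian", 10.1), ("pod", -5.0)]
radar = [(9.8, 42.0, -1.2), (-4.7, 120.0, 15.0), (30.0, 8.0, 0.0)]
print(fuse(cams, radar))  # [('pedestrian', 42.0, -1.2), ('pod', 120.0, 15.0)]
```

Note how the radar return at 30 degrees, matched by no camera object, never becomes an obstacle: the camera supplies the "what", the radar the "where and how fast".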
The third type of sensor, used by some driverless cars but not by SkyWay Technologies Co., is the lidar. A thin laser beam scans the area and produces an excellent optical reproduction while also measuring motion parameters well. Everything would seem fine, except for a few "buts" that keep us from using lidars in string transport. First, the high cost. Second, a lidar is a mechanical device, and mechanical devices break. Third, it is an optical device: in rain or fog, when the laser beam is scattered, it sees a wall. In our opinion, lidar is better used in California. Since we are targeting a diverse market and testing transport in Belarus, where rain and fog are everyday occurrences, lidars are impractical for us.
Almost SkyNet
As a rule, each unmanned vehicle is treated as an independent unit: self-sufficient, with fixed parameters and preset information such as a map of the area. The movement of such units is chaotic, which in turn leads to low throughput capacity.
In SkyWay, each element of the infrastructure is also endowed with "intelligence and quick wits", and on top of that there is a central intelligent control system (CICS). It controls the entire transport complex in accordance with its assigned tasks: it monitors the speed and safety of movement, combines traffic flows, and responds to the wishes of each individual client to ensure maximum comfort. CICS calculates the most convenient and optimal route tasks, continuously updating them with incoming data.
Another difference between SkyWay's intelligent systems and unmanned vehicles is the ability to respond quickly to emergencies. If any vehicle fails, the route assignments of the other pods linked with that route are instantly recalculated, and the measures needed to resolve the situation (evacuation, transportation, etc.) are taken. In other words, what distinguishes our system from competitors' is global traffic-flow control that does not compromise the individuality of each vehicle.
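One simplified way to picture the instant recalculation (a sketch, not CICS itself): treat the track network as a weighted graph and re-run a shortest-path search with the failed section excluded. The network below is hypothetical, with edge weights in minutes:

```python
import heapq

def shortest_route(graph, start, goal, blocked=frozenset()):
    """Dijkstra over a dict-of-dicts graph, skipping blocked nodes."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited or node in blocked:
            continue
        visited.add(node)
        for neighbor, weight in graph[node].items():
            if neighbor not in visited and neighbor not in blocked:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None  # no route available

# Hypothetical track network: nodes are stations/switches.
track = {
    "A": {"B": 4, "C": 7},
    "B": {"A": 4, "D": 3},
    "C": {"A": 7, "D": 2},
    "D": {"B": 3, "C": 2},
}
print(shortest_route(track, "A", "D"))                 # (7, ['A', 'B', 'D'])
print(shortest_route(track, "A", "D", blocked={"B"}))  # (9, ['A', 'C', 'D'])
```

When the node "B" (a failed section) is blocked, every pod routed through it gets a new assignment via "C" in a single recomputation.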
What do vehicles say?
Communication between vehicles is not limited to the centralized channel, however; otherwise the failure of a single element could lead to a collapse. Each vehicle therefore communicates with all elements of the transport infrastructure. Suppose the impossible happened and CICS ceased to function. Each individual pod still knows its task, communicates with the pods nearby, and relays information about its environment. All vehicles can therefore continue along their routes and respond to any changes regardless of whether CICS is operating.
Intelligent control system in action
The development of the vehicle-evacuation technology is an example of vehicles executing a task from CICS while also communicating with each other directly during docking.
"Doggie" is another concrete example of what our systems can do: the movement of vehicles in a virtual coupling. From the safety point of view, the distance between vehicles should depend on the braking distance needed for the second vehicle to stop when the first one stops, and ideally it would be as large as possible. On the other hand, to increase passenger traffic this distance must be reduced. How can a balance be found?
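The trade-off can be stated numerically. In a common simplified headway model (with assumed figures, not SkyWay's actual parameters), the follower needs its own reaction and braking distance, minus the distance the leader covers while braking, plus a safety margin:

```python
def min_headway(v, t_react=0.2, a_follow=3.0, a_lead=5.0, margin=5.0):
    """Minimum safe gap (m) for a follower travelling at v (m/s).

    Assumed illustrative figures: t_react - control-loop reaction
    time (s); a_follow, a_lead - braking decelerations (m/s^2) of the
    follower and the leader; margin - extra safety buffer (m).
    Worst case: the leader brakes harder than the follower can.
    """
    stop_follow = v * t_react + v ** 2 / (2 * a_follow)
    stop_lead = v ** 2 / (2 * a_lead)
    return stop_follow - stop_lead + margin

for kmh in (40, 80, 120):
    v = kmh / 3.6
    print(f"{kmh} km/h -> gap >= {min_headway(v):.1f} m")
```

The gap grows roughly with the square of speed, which is exactly why tightening the coupling at a given speed requires either faster reactions or better-matched braking performance between the pods.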
We are not pioneers of research into virtual coupling; our competitors have worked on it too. But we have applied and modernized the concept. The first vehicle is the leader and performs its route task; the others control and adjust their motion parameters: the distance, the interval, and the time gap to the module in front.
Why do we need virtual coupling?
For example, here is a task: move goods and passengers on the Minsk-Moscow route as quickly as possible. To solve it, we need to maintain a minimum interval between modules, and we can achieve that only through a virtual coupling. CICS's task is to assemble a virtual coupling based on the existing traffic schedules and the individual needs of passengers.
Evgeny Rodchenkov, Head of the Intelligent Systems Administration of SkyWay Technologies Co., appears in the office. Interested in the topic of conversation, he adds to what Yuri Adamovich has said:
By the way, in terms of commercial benefits, a virtual coupling reduces energy consumption when driving in a stream, much as it does for trucks in a convoy, and it increases passenger traffic and, accordingly, profit.
Yuri Adamovich continues with the technical side of the issue.
In calling the unibus a "doggie", Mr. Yunitsky demonstrated not only the adaptive following of a leader needed to implement the virtual coupling; he also showed the execution of some commands quite unusual for the system: escape and pursuit. When Mr. Yunitsky said that the problem our company has solved is a very serious one, he was neither boasting nor exaggerating. We really are in the lead on this issue.
What does the unibus see?
Returning to technical vision, let me remind you what Anatoly Yunitsky said at one point during the demonstration: "The system sees a lot of unnecessary objects and interference." That is because, unlike smooth asphalt, we were working on an imperfect surface. The fact that our system has learned to pick out the necessary objects and process the necessary information despite the surface distortions speaks of tremendous success.
If other people had been in the unibus's way during the tests, it would have handled them on the principle of the greatest potential threat, calculating the intersections of their routes from the motion parameters of both the vehicle and the potentially dangerous objects. After all, the nearest person may be moving away, while one standing in the distance may be approaching.
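A toy version of such prioritization (the real algorithms are more involved): for each object, predict its motion relative to the vehicle and rank by the predicted closest approach. A distant object closing in can then outrank a nearby one moving away, exactly as described above. The object data is invented:

```python
def closest_approach(rel_pos, rel_vel, horizon=10.0):
    """Predicted minimum distance (m) to an object within the horizon (s).

    rel_pos, rel_vel: the object's position (m) and velocity (m/s)
    relative to the vehicle, as (x, y) tuples.
    """
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # Time of closest approach, clamped to [0, horizon].
    t = 0.0 if v2 == 0 else max(0.0, min(horizon, -(px * vx + py * vy) / v2))
    dx, dy = px + vx * t, py + vy * t
    return (dx * dx + dy * dy) ** 0.5

# Hypothetical objects: a nearby person walking away, a far one approaching.
near_walking_away = closest_approach((5.0, 0.0), (2.0, 0.0))
far_approaching = closest_approach((40.0, 0.0), (-6.0, 0.0))
print(near_walking_away, far_approaching)  # the far object is the real threat
```

Ranking objects by this predicted distance (smallest first) gives the "greatest potential threat" ordering rather than a naive nearest-first one.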
Intelligent systems against vandalism
Our technical vision system already recognizes a weapon inside and outside the vehicle and sends a "potentially dangerous situation detected" signal to the dispatcher. It is the dispatcher who must take further action. Man still plays the most important role here, because the cost of failure would be very high if, say, a person appeared in the cabin not with a toy gun but with a real one, and we had left it to the artificial intelligence to choose what to do.
Recognizing acts of vandalism and potentially dangerous situations, such as aggressive behavior, is not easy, however. The AI will have to learn to distinguish a simple pat on the shoulder from a blow, and a greeting gesture from a threat. We are using huge datasets so that the vehicles will eventually be able to do it.
Evgeny Rodchenkov adds, "By the way, there is a film in which it is not a person who teaches a robot, but the other way around. It is called I Am Mother. It is about how a decision is chosen, why that one in particular, and who, in the end, is smarter: a man or a robot."
Why won’t robots take over the world soon?
Human flights into space, undertaken in spite of every danger; the decision to sacrifice oneself in an emergency to save another person; even the idea of "sending a courier for another bottle of wine" when it would seem time to stop: all these are irrational decisions, and only a human is capable of them. No one knows how long it will take to teach a machine to make them, or whether it will ever work at all. So it is too early to fear a robot uprising. For now, robots resemble small children, and at SkyWay we are very proud of "ours".