
Another ship called the Mayflower will cross the Atlantic this week, but it will not carry British pilgrims — or anyone — at all.

When the autonomous ship Mayflower leaves its home port of Plymouth, England, to attempt the world’s first fully autonomous transatlantic voyage, it will have a well-trained “captain” and a “navigator” proficient in the rules of collision avoidance at sea, both controlled by artificial intelligence (AI).

The ship’s AI captain was developed by IBM and is guided by an expert system whose code is widely used in the financial sector. The technology could one day help crewed ships navigate difficult conditions and enable low-cost exploration of the ocean, which covers 70 percent of the Earth’s surface.

In a journey of about two weeks, the Mayflower maritime drone will pass the Isles of Scilly and the site of the Titanic wreck before landing in Plymouth, Massachusetts, as the colonists aboard the first Mayflower did more than 400 years ago.

However, the sleek new ship will carry experiments instead of people, and it will have more room for them because it is designed without bedrooms, kitchens, or bathrooms.

Its modular compartments, inspired by the design of the space shuttle’s payload bay, can hold up to 700 kilograms of experiments.

“Now, it’s full,” Brett Phaneuf, managing director of MSubs, which built the Mayflower for the non-profit organization ProMare and its partner IBM, told attendees at the Xponential conference hosted in May by the Association for Unmanned Vehicle Systems International (AUVSI).

Science on board

Phaneuf said that many other companies, individuals, and universities have contributed experimental technology and data collection equipment.

As a result, the Mayflower will be able to regularly measure sea level and wave height and collect water samples for testing throughout the voyage.

Although the Mayflower will be uncrewed during this voyage, the ship’s artificial intelligence (AI) system allows it to change course on its own, for example if scientific experiments find something worthy of further investigation [Credit: Tom Barnes for IBM/ProMare]

The Mayflower will also sample for pollution and record water chemistry. The ship will carry a holographic microscope to scan water samples for microplastics, plastic fragments 5 mm or smaller that are harmful to marine life.

To determine the water’s chemistry, one of the experiments will “taste” it using a test originally designed to detect counterfeit wine and whiskey.

Lenny Bromberg, director of IBM’s automation, intelligence, and decision management program, told Al Jazeera: “You can dip your ‘tongue’ into a liquid, and it will provide you with the exact chemical characteristics of the liquid you are observing.”

The Mayflower will also use hydrophones to listen to whales.

IBM worked with the Jupiter Research Foundation and Plymouth University to create models of various whales and other cetaceans found in the North Atlantic. Phaneuf said that with these models, “we will be able to determine the type and number of animals” as well as their location and general environment.
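The species-matching idea described above can be sketched as a toy classifier: estimate the dominant frequency of a hydrophone clip and check which species’ typical call band contains it. Everything here, from the frequency bands to the crude zero-crossing pitch estimate, is a simplified illustration and an assumption on my part, not the acoustic models built by IBM and its partners.

```python
import math

# Hypothetical call-frequency bands (Hz), for illustration only; the
# real project uses trained acoustic models, not simple band matching.
SPECIES_BANDS = {
    "blue whale": (10, 40),
    "fin whale": (15, 30),
    "humpback whale": (80, 4000),
}

def dominant_frequency(samples, sample_rate):
    """Crude pitch estimate: count zero crossings (two per cycle)."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    duration = len(samples) / sample_rate
    return crossings / (2 * duration)

def candidate_species(samples, sample_rate):
    """List species whose typical band contains the estimated pitch."""
    f = dominant_frequency(samples, sample_rate)
    return [name for name, (lo, hi) in SPECIES_BANDS.items() if lo <= f <= hi]

# Synthetic 20 Hz tone standing in for a one-second hydrophone clip.
rate = 8000
clip = [math.sin(2 * math.pi * 20 * k / rate) for k in range(rate)]
matches = candidate_species(clip, rate)
```

A 20 Hz tone falls inside both low-frequency bands here, which is exactly why real systems need richer models than a single pitch estimate.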

Experiments at the helm

Although the Mayflower will be uncrewed during this voyage, the ship’s artificial intelligence system enables it to change course on its own, for example if scientific experiments find something worthy of further investigation.

Don Scott, one of the chief engineers of the Mayflower project, said the AI captain will be able to redirect operations as needed, “a very key difference that distinguishes it from other types of platforms”.

“Science experiments are not just passengers on the Mayflower,” Marine AI chief technology officer Scott told Al Jazeera.

To achieve this, IBM has been training the AI captain using its visual inspection technology and images of things the Mayflower may encounter at sea, IBM UK and Ireland Chief Technology Officer Andy Stanford-Clark told the Xponential audience.

If there is something to debug, or, heaven forbid, an accident, we can say, “Why did you make this decision?” It will explain exactly why it made this decision.

Andy Stanford-Clark, Chief Technology Officer, IBM UK and Ireland

Using a digital simulation, a kind of digital twin of the ocean, researchers place images in front of the AI captain’s virtual cameras and teach it what to do in different situations.

“That’s really our training ground. It gives us confidence that the AI captain will do the right thing when it encounters something it has never seen before,” Stanford-Clark said at the meeting.
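The simulation loop described here can be sketched as follows: present labeled synthetic “camera” scenes to a candidate perception policy and measure how often it reacts correctly. The scene names and the trivial policy are invented for illustration; the real system trains on ocean imagery in a far richer digital twin.

```python
# Hypothetical labeled scenes a digital twin might render.
SIMULATED_SCENES = [
    {"image": "cargo_ship_ahead", "correct_action": "give way"},
    {"image": "open_water", "correct_action": "hold course"},
    {"image": "buoy_to_port", "correct_action": "hold course"},
    {"image": "sailboat_crossing", "correct_action": "give way"},
]

def toy_policy(image_name):
    """Stand-in for the AI captain's vision stack."""
    hazards = ("ship", "sailboat")
    return "give way" if any(h in image_name for h in hazards) else "hold course"

def evaluate(policy, scenes):
    """Fraction of simulated scenes the policy handles correctly."""
    correct = sum(policy(s["image"]) == s["correct_action"] for s in scenes)
    return correct / len(scenes)

accuracy = evaluate(toy_policy, SIMULATED_SCENES)
```

The point of the exercise is the evaluation harness, not the policy: the simulator supplies ground truth the open ocean cannot.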

The Mayflower’s AI captain can now draw on information from the ship’s cameras and radar, IBM’s weather service, and coastal maps and telemetry broadcast by ships through the automatic identification system.

Stanford-Clark explained that the artificial intelligence captain puts these parameters and other factors (such as the ship’s battery power, wind speed, and wind direction) into a “big optimizer”.

The system then generates a response based on those constraints, he said, deciding “what is the next best thing you can do? Where should you go, at what speed, and in what direction?”
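A minimal sketch of that optimize-over-constraints step, with invented state variables and weights (this is not IBM’s implementation): score a handful of candidate (heading, speed) actions against the ship’s state and pick the best one.

```python
import math

def score_action(heading_deg, speed_kn, state):
    """Higher is better: favor progress toward the goal bearing,
    penalize battery drain and heading into the wind."""
    # Progress: speed projected onto the bearing toward the goal.
    progress = speed_kn * math.cos(math.radians(heading_deg - state["goal_bearing_deg"]))
    # Battery penalty grows with speed, and is worse when battery is low.
    battery_penalty = (speed_kn ** 2) * (1.0 - state["battery_frac"])
    # Headwind penalty: sailing toward the wind's source costs energy.
    headwind = max(0.0, math.cos(math.radians(heading_deg - state["wind_from_deg"])))
    return progress - 0.1 * battery_penalty - 0.5 * headwind * speed_kn

def next_best_action(state):
    """Brute-force search over a coarse grid of candidate actions."""
    candidates = [(h, s) for h in range(0, 360, 15) for s in (2, 4, 6)]
    return max(candidates, key=lambda hs: score_action(hs[0], hs[1], state))

# Goal due west, healthy battery, wind from the east (a tailwind).
state = {"goal_bearing_deg": 270, "battery_frac": 0.8, "wind_from_deg": 90}
heading, speed = next_best_action(state)
```

With a tailwind and a charged battery, the grid search picks full speed straight toward the goal; the interesting behavior appears when the penalties start to compete.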

Decisions at the edge

But an AI system’s decision-making process can be opaque. Sea travel is a regulated activity, and understanding how a decision was made is crucial, so the team went a step further.

It added IBM Operational Decision Manager (ODM), a rule-based expert system with a long history in the financial industry. Stanford-Clark said it is “very, very good at parsing rules”.

The team provided the ODM with the International Regulations for Preventing Collisions at Sea (COLREGs) so that it can follow maritime rules.
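The rule-plus-audit-trail idea can be illustrated with a toy rule engine that applies two heavily simplified COLREGs rules and records which rule fired. The real COLREGs are far more nuanced, and ODM is a full rule-management product rather than a short if-chain like this; the situation fields below are invented for the sketch.

```python
def decide(situation, trail):
    """Return a maneuver for a two-vessel encounter, logging which
    rule fired so the decision can be audited afterwards."""
    # Rule 14 (head-on): each vessel alters course to starboard.
    if situation["encounter"] == "head-on":
        trail.append("Rule 14: head-on encounter -> alter course to starboard")
        return "alter course to starboard"
    # Rule 15 (crossing): give way if the other vessel is on your starboard side.
    if situation["encounter"] == "crossing" and situation["other_on_starboard"]:
        trail.append("Rule 15: crossing, give-way vessel -> alter course to starboard")
        return "alter course to starboard"
    trail.append("No give-way rule fired -> stand on, maintain course and speed")
    return "maintain course and speed"

audit_trail = []
action = decide({"encounter": "head-on", "other_on_starboard": False}, audit_trail)
```

Because every decision appends the rule that produced it, the trail can answer “why did you make this decision?” after the fact, which is the property the team values in ODM.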

Scientific experiments are not just passengers on the Mayflower.

Don Scott, Chief Technology Officer, Marine AI

And because ODM is a rule-based system, “it is completely interpretable,” Stanford-Clark said, and can provide an “audit trail” of the decisions it makes.

“If there is something that needs to be debugged, or, heaven forbid, an accident occurs, we can say, ‘Why did you make this decision?’ It will explain exactly why it made this decision,” he said.

It is this technology that attracted the interest of Larry Mayer, professor and director of the Center for Coastal and Ocean Mapping at the University of New Hampshire and one of the leaders of the Seabed 2030 project.

“This is a big, hard thing,” Mayer told Al Jazeera. “I think we will all be very, very interested in understanding how it works, and I very much hope that it works well.”


