How Far Away Are We from Autonomous Cars?

Curtis Moldrich & Victoria Woollaston

英语世界 (English World), 2020, Issue 5

Driverless cars used to be the sort of thing you'd see in sci-fi films—but they're becoming a reality. Autonomous car technology is already being developed by the likes of Lexus, BMW and Mercedes, and we've even tested Tesla's driverless Autopilot system on UK roads. Across the Atlantic, Google is developing its automated technology in the wild, and Apple is rumoured to be working with BMW on its own—probably automated—car.

Fully driverless tech is still at an advanced testing stage, but partially automated technology has been around for the last few years. Executive saloons like the BMW 7 Series feature automated parking, and can even be controlled remotely.

With so much investment and interest in driverless technology, it's easy to assume that self-operating cars are imminent, but before our roads are flooded with driverless vehicles, manufacturers must tackle a range of technical and ethical challenges, and combat the biggest threat to autonomous technology: humans.

The Google Car

Autonomous vehicles rely on a range of sensors to interact with the world around them, with the Google Car prototype coming equipped with eight.

The most noticeable is the rotating roof-top LIDAR—a camera that uses an array of either 32 or 64 lasers to measure the distance between objects, building up a 3D map at a range of 200m and allowing the car to “see” hazards. The car also sports another set of “eyes”, a standard camera that points through the windscreen. This looks for nearby hazards like pedestrians, cyclists and other motorists, as well as reading road signs and detecting traffic lights. Speaking of other motorists, bumper-mounted radar, already used in intelligent cruise control, tracks other vehicles in front of and behind the car.
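To make the LIDAR's role concrete, here is a minimal sketch of how a spinning multi-laser unit's range readings could be turned into 3D points. The beam angles, sampling step and function names are illustrative assumptions, not details of Google's actual sensor pipeline.

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (range plus beam angles) into an (x, y, z)
    point in the car's frame. Purely illustrative geometry."""
    az = math.radians(azimuth_deg)    # rotation angle of the roof unit
    el = math.radians(elevation_deg)  # fixed tilt of this particular laser
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# One full rotation of an assumed 64-laser unit, sampled every 2 degrees,
# yields a point cloud the car can treat as a 3D map of nearby objects.
elevations = [-24.0 + i * 0.75 for i in range(64)]  # assumed beam spread
cloud = [
    lidar_return_to_point(range_m=35.0, azimuth_deg=az, elevation_deg=el)
    for az in range(0, 360, 2)
    for el in elevations
]
print(f"{len(cloud)} points in one sweep")
```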

Externally, the car has a rear-mounted aerial that receives geolocation information from GPS satellites, and an ultrasonic sensor on one of the rear wheels monitors the car's movements. Internally, the car has altimeters, gyroscopes and a tachometer (a rev-counter) to give even finer measurements on the car's position, all of which combine to give it the highly accurate data needed to operate safely.
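The idea of combining GPS, gyroscope and rev-counter data can be illustrated with a toy fusion loop: dead reckoning from speed and yaw rate, occasionally nudged toward a GPS fix. The blending weight and update rate below are assumptions for the sketch, not values used by any real vehicle.

```python
import math

class PoseEstimator:
    """Toy dead-reckoning plus GPS blend. Illustrative only; real vehicles
    use far more sophisticated filters (e.g. Kalman-family estimators)."""

    def __init__(self, x=0.0, y=0.0, heading_rad=0.0):
        self.x, self.y, self.heading = x, y, heading_rad

    def predict(self, speed_mps, yaw_rate_rps, dt):
        # Wheel-speed (tachometer/ultrasonic) and gyroscope step.
        self.heading += yaw_rate_rps * dt
        self.x += speed_mps * dt * math.cos(self.heading)
        self.y += speed_mps * dt * math.sin(self.heading)

    def correct_with_gps(self, gps_x, gps_y, weight=0.2):
        # Pull the drifting dead-reckoned estimate toward the GPS fix.
        self.x += weight * (gps_x - self.x)
        self.y += weight * (gps_y - self.y)

est = PoseEstimator()
for _ in range(50):                      # 5 s of driving, updated at 10 Hz
    est.predict(speed_mps=8.0, yaw_rate_rps=0.02, dt=0.1)
est.correct_with_gps(gps_x=39.5, gps_y=1.2)
print(round(est.x, 1), round(est.y, 1))
```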

Using these arrays, the Google Car can read the road like a human, but these sensors come with their own limitations. Autonomous cars simply replace the human eye with a camera, leaving them vulnerable to extreme sunlight, weather or even defective traffic lights. In current autonomous cars, the way this selection of pixels is analysed could be the difference between a safe journey and death.
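As a deliberately simplistic illustration of that "selection of pixels", the toy classifier below decides a traffic light's state from average colour alone, and a washed-out, sun-bleached image defeats it. This shows the kind of fragility the paragraph describes, not how any production perception system actually works.

```python
def classify_traffic_light(pixels):
    """Toy classifier: decide a light's state from the average colour of the
    pixels inside its bounding box. Illustrative of fragility only."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    if r > 150 and g < 100:
        return "red"
    if g > 150 and r < 160:
        return "green"
    return "unknown"

print(classify_traffic_light([(220, 40, 30)] * 100))    # clear view -> 'red'
# Direct sunlight washes every channel out and the same light becomes
# 'unknown' -- the failure mode described above.
print(classify_traffic_light([(250, 245, 240)] * 100))
```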

Connected cars

Many believe a connection between cars and traffic infrastructure is needed to combat this problem. “Car-to-car and car-to-infrastructure communication is essential for enabling autonomous driving,” says Christoph Reifenrath, senior manager in technology marketing of Harman's infotainment division, which supplies in-car tech to the likes of Audi, BMW and Mercedes.

“For example, as your car approaches a red light, we'll give you information. How can we provide this information in every car at every red light? There has to be a solution for that if you want to enable autonomous driving in areas with traffic lights.”
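A hypothetical sketch of the kind of car-to-infrastructure message Reifenrath describes: a traffic light broadcasting its state and time-to-change, and the approaching car deciding whether it can clear the junction. The message fields and helper names are invented for illustration and do not follow any published V2X standard.

```python
import json
from dataclasses import dataclass

@dataclass
class SignalPhaseMessage:
    """Hypothetical payload a connected traffic light might broadcast;
    field names are illustrative assumptions."""
    intersection_id: str
    state: str              # "red", "amber" or "green"
    seconds_to_change: float

def should_slow_down(msg: SignalPhaseMessage, distance_m: float, speed_mps: float) -> bool:
    """Decide whether the car can clear the junction before the light changes."""
    time_to_reach = distance_m / speed_mps
    if msg.state == "red":
        return time_to_reach < msg.seconds_to_change   # still red on arrival
    if msg.state == "amber":
        return time_to_reach > msg.seconds_to_change   # will turn red first
    return False

raw = '{"intersection_id": "junction-17", "state": "red", "seconds_to_change": 12.0}'
msg = SignalPhaseMessage(**json.loads(raw))
print(should_slow_down(msg, distance_m=80.0, speed_mps=13.0))   # True: start braking
```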

The German automotive industry is one of the most powerful advocates of a connected car-traffic infrastructure. In 2015, manufacturers including Daimler, BMW and Audi paid $3.1 billion for the Nokia Here mapping service, which will be used as a platform for a connected-car environment, the consortium said in a joint statement.

To become a viable solution, these systems will be required in every vehicle, including those still used by humans. It's likely that emergency vehicles like ambulances and police cars will continue to use human drivers, so they'll need a method of communicating with the autonomous cars around them.

“You have to know where [an emergency vehicle] comes from and when it will be there, so the information is shared between this car and your car,” adds Reifenrath.
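In the same spirit, a hypothetical car-to-car alert from an emergency vehicle might carry where it is coming from and when it will arrive, letting the autonomous car decide whether to yield. Again, the fields and logic are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class EmergencyVehicleAlert:
    """Hypothetical message an ambulance might broadcast to nearby cars;
    the fields are illustrative assumptions."""
    vehicle_id: str
    approach_bearing_deg: float   # direction it is coming from
    eta_seconds: float            # when it will reach the shared junction

def plan_response(alert: EmergencyVehicleAlert, my_eta_seconds: float) -> str:
    """Yield if the emergency vehicle will occupy the junction at about the same time."""
    if abs(alert.eta_seconds - my_eta_seconds) < 10.0:
        return "pull over and yield"
    return "proceed normally"

alert = EmergencyVehicleAlert("ambulance-42", approach_bearing_deg=270.0, eta_seconds=8.0)
print(plan_response(alert, my_eta_seconds=12.0))   # "pull over and yield"
```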

The human problem

Humans present problems for autonomous cars as both drivers and pedestrians, and dealing with our unpredictable behaviour represents a significant challenge for the technology.

The Google Car is one of the most experienced autonomous vehicles, and its interaction with human drivers has exposed one of driverless cars' main weaknesses. The first injury involving the Google Car wasn't due to a fault in its system, but human error.

While correctly waiting at traffic lights, Google's self-driving car was hit by an inattentive driver and, despite its sophisticated array of sensors, there was little it could do to avoid the incident. Luckily, the accident only resulted in whiplash for a few of the passengers, but it's a reminder that autonomous cars are at risk when surrounded by human road users.

Despite their sophisticated systems, self-driving cars currently have no plan B for human road users. Human drivers are able to interact with each other and make allowances, but also make countless, small mistakes when driving—mistakes to which current self-driving cars simply can't adapt.

Dealing with pedestrians

The way human drivers interact with pedestrians raises difficult moral and ethical questions for car manufacturers—with implications.

Autonomous cars need to understand the way pedestrians behave, while also mimicking the behaviour they'd expect from a human driver. “Everyone has an appreciation of how a human being is going to react, because we are all human beings,” says computer ethics commentator Ben Byford. “So if you walk out in front of a car, and presumably the car driver knows you're there, they're going to react in a certain way.”

“If I walked out in front of a Google Car travelling at 60mph, I have no real appreciation of how the vehicle will behave, so I'm effectively putting myself at a disadvantage.”

