Enhancing Cognitive Assistance Systems with Inertial Measurement Units

· Book 105 in the Studies in Computational Intelligence series · Springer
eBook · 146 pages

About this eBook

1.1 Motivation

Throughout the last decade, major car manufacturers have introduced a great variety of different assistance systems to make driving safer and more comfortable. Mechatronic systems like dynamic stability control, situation-dependent suspension adjustment, or the recently introduced night vision cameras are just a few examples among a range of helpful developments. The current focus of research is on cognitive assistance systems for high-level perception of the environment. Rather than present mere images to the driver, these future systems extract all relevant information and provide a reduced set that alerts to possible dangers or, in a further stage, directly takes over control of the vehicle to prevent accidents. These "Steps Towards the Seeing Car" were presented as a keynote speech at the Advanced Microsystems for Automotive Applications 2007 conference by Volkswagen [87]. In addition to radar, ultrasonic, and laser sensors, concepts include video cameras for the perception of the medium and far range. Video will play an especially important role in future driver assistance, since the traffic environment is designed for human needs and thus relies predominantly on visual cues. For the visual assessment of the environment, two aspects are crucial and will be covered in this book. First, the vehicle has to know its own motion in space in order to reliably differentiate visual motion caused by other objects, such as moving pedestrians, from that of the static environment generated by the car's own movement. Second, the visual sensor needs to cover a wide range with high resolution and precision.
