Sensors for ADAS Technology: 3. RADARs

This article is the third in a three-part series about the sensors used by the automotive industry to enable perception in autonomous vehicles.

RGB cameras, LiDARs and radars are the three main sensors the automotive industry uses to provide perception for autonomous vehicles at various levels of autonomy. Each of these three ADAS technologies has its own advantages and disadvantages, so the ideal system combines all three.

Radars

Radar technology can be used to complement the information coming from the other sensors. It transmits high-frequency radio waves to measure the range, direction and velocity of objects, and it works in any weather condition and at night. One of its shortcomings is that it cannot resolve small features. Radars are already used today to implement Adaptive Cruise Control (ACC): they reliably measure the distance to the car ahead so the vehicle can adapt its speed.
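To illustrate how a radar's range and range-rate measurements can drive an ACC function, here is a minimal Python sketch. The `RadarTarget` structure, the time-gap policy and the gains are illustrative assumptions, not the algorithm of any particular production system.

```python
# Minimal ACC-style following-distance sketch driven by radar measurements.
# All names, gains and thresholds are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional


@dataclass
class RadarTarget:
    range_m: float          # distance to the lead vehicle, meters
    range_rate_mps: float   # relative speed (negative = closing), meters/second


def acc_speed_command(ego_speed_mps: float,
                      set_speed_mps: float,
                      target: Optional[RadarTarget],
                      time_gap_s: float = 2.0,
                      gain: float = 0.5) -> float:
    """Cruise at the set speed when the lane is clear, otherwise track a
    safe time gap behind the radar target."""
    if target is None:
        return set_speed_mps  # no lead vehicle detected: plain cruise control

    desired_gap_m = time_gap_s * ego_speed_mps
    gap_error_m = target.range_m - desired_gap_m

    # Proportional adjustment on the gap error plus the measured range rate,
    # clamped so the command never exceeds the driver's set speed.
    command = ego_speed_mps + gain * gap_error_m + target.range_rate_mps
    return max(0.0, min(command, set_speed_mps))


if __name__ == "__main__":
    lead = RadarTarget(range_m=35.0, range_rate_mps=-2.0)  # lead car 35 m ahead, closing
    print(acc_speed_command(ego_speed_mps=27.0, set_speed_mps=33.0, target=lead))
```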

We started by saying that each sensor's disadvantages can be compensated for by combining (fusing) their data. Sensor fusion is performed by one central computer that integrates all the information it receives to form a complete view (perception). For example, when the LiDAR returns a cluster of points at some distance, the RGB camera's image is used to identify the object from its features and colors. To cope with the heavy data streams produced by the sensors, some systems today use edge processors located on the sensors themselves; these perform initial processing (typically data compression) so that the central computer can carry out fusion and analysis faster and more efficiently.
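To make the LiDAR-plus-camera example concrete, the sketch below projects a cluster of LiDAR points into the camera image so that each 3D point can be associated with the pixels (and hence the object the camera detects there). The calibration matrices, axis convention and point values are illustrative assumptions only.

```python
# Minimal sketch of one common camera-LiDAR fusion step: projecting LiDAR
# points into the RGB image. The transform, intrinsics and points are
# assumed values for illustration.

import numpy as np


def project_lidar_to_image(points_xyz: np.ndarray,
                           T_cam_from_lidar: np.ndarray,
                           K: np.ndarray) -> np.ndarray:
    """Project Nx3 LiDAR points into Nx2 pixel coordinates.

    T_cam_from_lidar: 4x4 rigid transform from the LiDAR frame to the camera frame.
    K: 3x3 camera intrinsic matrix.
    """
    # Homogeneous LiDAR points -> camera frame
    ones = np.ones((points_xyz.shape[0], 1))
    pts_cam = (T_cam_from_lidar @ np.hstack([points_xyz, ones]).T).T[:, :3]

    # Keep only points in front of the camera, then apply the pinhole model
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    pixels = (K @ pts_cam.T).T
    return pixels[:, :2] / pixels[:, 2:3]


if __name__ == "__main__":
    # A small cluster of LiDAR points roughly 20 m ahead (assumed values)
    cluster = np.array([[20.0, 0.5, 0.0],
                        [20.2, 0.4, 0.1],
                        [19.9, 0.6, -0.1]])
    # Assumed axis convention: LiDAR x-forward maps to camera z-forward
    T = np.array([[0.0, -1.0, 0.0, 0.0],
                  [0.0, 0.0, -1.0, 0.0],
                  [1.0, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    uv = project_lidar_to_image(cluster, T, K)
    # Pixels falling inside a camera detection box would inherit that object's label
    print(uv)
```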

Read the first article (about RGB cameras) and the second article (about LiDARs) in this series about sensors for ADAS. RSIP Vision's engineers are experts in autonomous driving and ADAS technology. Call us and we will help you with your ADAS technology project.
