| Content Provider | Springer Nature Link |
|---|---|
| Author | Young, Gin Shu; Herman, Martin; Hong, Tsai Hong; Jiang, David; Yang, Jackson C.S. |
| Copyright Year | 1998 |
| Abstract | For autonomous vehicles to achieve terrain navigation, obstacles must be discriminated from terrain before any path planning and obstacle avoidance activity is undertaken. In this paper, a novel approach to obstacle detection has been developed. The method finds obstacles in the 2D image space, as opposed to 3D reconstructed space, using optical flow. Our method assumes that both nonobstacle terrain regions and regions with obstacles will be visible in the imagery. Therefore, our goal is to discriminate between terrain regions with obstacles and terrain regions without obstacles. Our method uses new visual linear invariants based on optical flow. Employing the linear invariance property, obstacles can be detected directly by using reference flow lines obtained from measured optical flow. The main features of this approach are: (1) 2D visual information (i.e., optical flow) is directly used to detect obstacles; no range, 3D motion, or 3D scene geometry is recovered; (2) knowledge about the camera-to-ground coordinate transformation is not required; (3) knowledge about vehicle (or camera) motion is not required; (4) the method is valid for a vehicle (or camera) undergoing general six-degree-of-freedom motion; (5) the error sources involved are reduced to a minimum, because the only information required is one component of optical flow. Numerous experiments using both synthetic and real image data are presented. Our methods are demonstrated in both ground and air vehicle scenarios. (An illustrative sketch of the detection idea follows this table.) |
| Starting Page | 45 |
| Ending Page | 71 |
| Page Count | 27 |
| File Format | |
| ISSN | 0920-5691 |
| Journal | International Journal of Computer Vision |
| Volume Number | 28 |
| Issue Number | 1 |
| e-ISSN | 1573-1405 |
| Language | English |
| Publisher | Kluwer Academic Publishers |
| Publisher Date | 1998-01-01 |
| Publisher Place | Boston |
| Access Restriction | One Nation One Subscription (ONOS) |
| Subject Keyword | Artificial Intelligence (incl. Robotics); Computer Imaging, Graphics and Computer Vision; Image Processing; Automation and Robotics |
| Content Type | Text |
| Resource Type | Article |
| Subject | Artificial Intelligence; Computer Vision and Pattern Recognition; Software |
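The abstract describes detecting obstacles directly from a single component of optical flow by comparing it against reference flow lines that, over nonobstacle planar terrain, vary linearly across the image. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: it fits a reference line to one flow component sampled along an image row (assuming the row covers mostly flat terrain) and flags pixels whose measured flow departs from the linear prediction. The function name, the least-squares fit, and the residual threshold are illustrative assumptions.

```python
import numpy as np

def detect_obstacles_along_row(flow_component, terrain_mask=None, threshold=0.5):
    """Flag obstacle candidates along one image row (illustrative sketch).

    flow_component : 1D array of one optical-flow component (e.g. the
        vertical flow v) sampled along an image row.
    terrain_mask   : optional boolean array marking pixels assumed to be
        flat terrain; used to fit the reference flow line.
    threshold      : residual (pixels/frame) above which a pixel is
        treated as an obstacle candidate; illustrative value only.
    """
    cols = np.arange(len(flow_component), dtype=float)
    if terrain_mask is None:
        terrain_mask = np.ones(len(flow_component), dtype=bool)

    # Fit the reference flow line v(x) = a*x + b from assumed terrain pixels.
    a, b = np.polyfit(cols[terrain_mask], flow_component[terrain_mask], deg=1)

    # Pixels whose measured flow departs from the linear prediction
    # are flagged as obstacle candidates.
    residual = np.abs(flow_component - (a * cols + b))
    return residual > threshold


# Example with synthetic data: flow grows linearly along the row except
# over a simulated obstacle between columns 80 and 110.
v = 0.02 * np.arange(200) + 1.0
v[80:110] += 1.5
print(np.flatnonzero(detect_obstacles_along_row(v)))  # roughly columns 80..109
```

In practice the reference flow lines would be estimated from image regions known (or assumed) to be obstacle-free, and the threshold tuned to the noise level of the flow estimates.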
National Digital Library of India (NDLI) is a virtual repository of learning resources which is not just a repository with search/browse facilities but also provides a host of services for the learner community. It is sponsored and mentored by the Ministry of Education, Government of India, through its National Mission on Education through Information and Communication Technology (NMEICT). Filtered and federated searching is employed to facilitate focused searching so that learners can find the right resource with the least effort and in minimum time. NDLI provides user group-specific services such as Examination Preparatory for school and college students and job aspirants. Services for researchers and general learners are also provided. NDLI is designed to hold content in any language and provides interface support for the 10 most widely used Indian languages. It is built to support all academic levels, including researchers and life-long learners, all disciplines, all popular forms of access devices, and differently-abled learners. It is designed to enable people to learn and prepare from best practices from all over the world and to facilitate researchers in performing inter-linked exploration across multiple sources. It is developed, operated and maintained by the Indian Institute of Technology Kharagpur.
NDLI is a conglomeration of freely available, institutionally contributed, donated, or publisher-managed content. Almost all of this content is hosted on and accessed from the respective sources. The responsibility for the authenticity, relevance, completeness, accuracy, reliability and suitability of the content rests with the respective organizations; NDLI has no responsibility or liability for it. Every effort is made to keep the NDLI portal up and running smoothly, barring unavoidable technical issues.
The Ministry of Education, through its National Mission on Education through Information and Communication Technology (NMEICT), has sponsored and funded the National Digital Library of India (NDLI) project.
| Sl. | Authority | Responsibilities | Communication Details |
|---|---|---|---|
| 1 | Ministry of Education (GoI), Department of Higher Education | Sanctioning Authority | https://www.education.gov.in/ict-initiatives |
| 2 | Indian Institute of Technology Kharagpur | Host Institute of the Project: responsible for providing infrastructure support and hosting the project | https://www.iitkgp.ac.in |
| 3 | National Digital Library of India Office, Indian Institute of Technology Kharagpur | The administrative and infrastructural headquarters of the project | Dr. B. Sutradhar (bsutra@ndl.gov.in) |
| 4 | Project PI / Joint PI | Principal Investigator and Joint Principal Investigators of the project | Dr. B. Sutradhar (bsutra@ndl.gov.in); Prof. Saswat Chakrabarti (will be added soon) |
| 5 | Website/Portal (Helpdesk) | Queries regarding NDLI and its services | support@ndl.gov.in |
| 6 | Contents and Copyright Issues | Queries related to content curation and copyright issues | content@ndl.gov.in |
| 7 | National Digital Library of India Club (NDLI Club) | Queries related to NDLI Club formation, support, user awareness program, seminar/symposium, collaboration, social media, promotion, and outreach | clubsupport@ndl.gov.in |
| 8 | Digital Preservation Centre (DPC) | Assistance with digitizing and archiving copyright-free printed books | dpc@ndl.gov.in |
| 9 | IDR Setup or Support | Queries related to establishment and support of Institutional Digital Repository (IDR) and IDR workshops | idr@ndl.gov.in |