Document Type : Original Research Paper


Department of Electronics, Faculty of Electrical Engineering, Shahrood University of Technology, Shahrood, Iran


Background and Objectives: Lane detection systems are an important part of safe and secure driving, alerting the driver in the event of deviation from the main lane. Lane detection can also save the lives of car occupants if the vehicle leaves the road due to driver distraction.
Methods: In this paper, a real-time, illumination-invariant lane detection method for high-speed video images is presented in three steps. In the first step, the necessary preprocessing is performed, including noise removal, conversion of the RGB input image to grayscale, and binarization. Then, a polygonal area in front of the vehicle is chosen as the region of interest to increase the processing speed. Finally, edges within the region of interest are obtained with an edge detection algorithm, and the lanes on both sides of the vehicle are identified using the Hough transform.
Results: The proposed method was evaluated on the IROADS database. It works well under different daylight conditions, such as sunny, snowy, or rainy days, and inside tunnels. Implementation results show that the proposed algorithm has an average processing time of 28 milliseconds per frame and a detection accuracy of 96.78%.
Conclusion: This paper describes a straightforward method for identifying road lanes in high-speed video images using the edge feature.

©2021 The author(s). This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, as long as the original authors and source are cited. No permission is required from the authors or the publishers.



Open Access

This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit:


Publisher’s Note

JECEI Publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.



Shahid Rajaee Teacher Training University


Journal of Electrical and Computer Engineering Innovations (JECEI) welcomes letters to the editor for post-publication discussion and corrections, allowing debate after publication on its site through the Letters to Editor section. Letters pertaining to manuscripts published in JECEI should be sent to the editorial office of JECEI within three months of online publication or before print publication, except for critiques of original research. The following points are to be considered before sending letters (comments) to the editor.

[1] Letters that include statements of statistics, facts, research, or theories should include appropriate references, although more than three references are discouraged.

[2] Letters that are personal attacks on an author rather than thoughtful criticism of the author’s ideas will not be considered for publication.

[3] Letters can be no more than 300 words in length.

[4] Letter writers should include a statement at the beginning of the letter indicating whether or not it is being submitted for publication.

[5] Anonymous letters will not be considered.

[6] Letter writers must include their city and state of residence or work.

[7] Letters will be edited for clarity and length.