Towards Low-cost Plastic Recognition using Machine Learning and Multi-spectra Near-infrared Sensor

Abstract

This work presents a low-cost sensor and machine-learning approach for recognizing plastics in everyday objects. The sensor is a multi-spectral near-infrared sensor capable of measuring 64 wavelengths. Data processing and analysis are performed with four machine learning methods: Random Forest, Support Vector Machines, Multi-Layer Perceptron, and Convolutional Neural Networks. Validation is performed on data samples collected from six types of waste plastic found in household recycling and from virgin materials. Convolutional Neural Networks and Support Vector Machines achieve the highest recognition accuracy: 62.08% on waste plastics and 54.72% on virgin plastics, respectively. The results show that this low-cost multi-spectral near-infrared sensor combined with machine learning can be effective for plastic recognition and could enable new applications in other fields that require affordable, portable solutions, such as agriculture, e-waste recycling, healthcare, and manufacturing.
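To illustrate the kind of pipeline the abstract describes, here is a minimal sketch of classifying 64-channel near-infrared spectra into six plastic types with two of the four methods mentioned (Random Forest and SVM), using scikit-learn. The data below is synthetic and the class structure is an assumption for illustration; the paper's actual spectra, preprocessing, and model hyperparameters are not given in this record.

```python
# Hypothetical sketch, not the authors' implementation: each plastic class is
# modeled as a characteristic 64-point mean spectrum plus Gaussian noise.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_WAVELENGTHS = 64   # the sensor measures 64 spectral channels
N_CLASSES = 6        # six plastic types, matching the abstract

# Synthetic stand-in for real spectra: one mean spectrum per class, plus noise.
means = rng.normal(size=(N_CLASSES, N_WAVELENGTHS))
y = rng.integers(0, N_CLASSES, size=600)
X = means[y] + 0.5 * rng.normal(size=(600, N_WAVELENGTHS))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
    # SVMs are scale-sensitive, so standardize each wavelength channel first.
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)),
}
accuracies = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    accuracies[name] = model.score(X_te, y_te)
    print(f"{name}: {accuracies[name]:.2%}")
```

On real household-waste spectra the separation is far weaker than in this synthetic setup, which is consistent with the ~62% best-case accuracy reported in the abstract.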
Original language: English
Title of host publication: 2023 IEEE SENSORS
Publisher: IEEE
Number of pages: 4
ISBN (Electronic): 9798350303872
ISBN (Print): 9798350303889
DOI: https://doi.org/10.1109/SENSORS56945.2023
Publication status: E-pub ahead of print - 28 Nov 2023
Event: IEEE Sensors 2023 - Hilton Vienna Park, Vienna, Austria
Duration: 29 Oct 2023 – 1 Nov 2023

Conference

Conference: IEEE Sensors 2023
Country/Territory: Austria
City: Vienna
Period: 29/10/23 – 1/11/23

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 12 - Responsible Consumption and Production
