Please use this identifier to cite or link to this item: http://localhost:8080/xmlui/handle/123456789/1037
Full metadata record
DC Field | Value | Language
dc.contributor.author | Venugopal Raskatla; Purnesh Singh Badavath; Vijay Kumar | -
dc.date.accessioned | 2024-10-07T05:15:53Z | -
dc.date.available | 2024-10-07T05:15:53Z | -
dc.date.issued | 2022-03-31 | -
dc.identifier.citation | DOI: 10.1117/1.OE.61.3.036114 | en_US
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/1037 | -
dc.description.abstract | Machine learning has emerged as a powerful tool for physicists for building empirical models from data. We exploit two convolutional networks, namely AlexNet and the wavelet scattering network, for the classification of orbital angular momentum (OAM) beams. We present a comparative study of these two methods for the classification of 16 OAM modes having radial and azimuthal phase profiles and eight OAM superposition modes, with and without atmospheric turbulence effects. Instead of direct OAM intensity images, we use the corresponding speckle intensities as input to the model. Our study demonstrates a noise- and alignment-free OAM mode classifier with a maximum accuracy of >94% with turbulence and >99% without turbulence. The main advantage of this method is that mode classification can be done by capturing a small region of the speckle intensity containing a sufficient number of speckle grains. We also discuss the smallest region that needs to be captured and the optimal detector resolution required for mode classification. | en_US
dc.language.iso | en | en_US
dc.publisher | SPIE | en_US
dc.subject | convolutional neural network | en_US
dc.subject | wavelet scattering transform | en_US
dc.subject | machine learning | en_US
dc.subject | speckle | en_US
dc.title | Convolutional networks for speckle-based orbital angular momentum modes classification | en_US
dc.type | Article | en_US
Appears in Collections:Physics

Files in This Item:
File | Description | Size | Format
2-OE_61_0361141_2022.pdf | - | 1.72 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.