Department of Food Science and Technology, Punjab Agricultural University, Ludhiana, India
Sensory science has witnessed increased adoption of technological advancements in recent years. Food analysis based on human senses is prone to evaluation errors and is complicated by the assessment methods involved. Hence, the adoption of tools capable of mimicking human senses is considered a more viable approach. This article provides a critical overview of developments in sensory science, detailing the technology behind the construction and working of the electronic tongue, electronic nose, and electronic eye. It also summarizes the industrial applications of artificial senses and the fusion technique in the monitoring and assessment of food quality.
Key words: artificial senses, e-eye, e-nose, e-tongue, fusion technique, sensory evaluation
*Corresponding Author: Kamaljit Kaur, Department of Food Science and Technology, Punjab Agricultural University, Ludhiana, India. Email: [email protected]
Received: 25 December 2021; Accepted: 15 April 2022; Published: 10 June 2022
© 2022 Codon Publications
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License (http://creativecommons.org/licenses/by-nc-sa/4.0/).
The concept of sensory science dates back to the 1940s when food acceptance methodologies were developed based on customer or hedonic scales (Tuorila and Monteleone, 2009). The evolution of the field led to a more sophisticated approach where sensory science is defined as “a scientific method used to evoke, measure, analyze, and interpret those responses to products as perceived through the senses of sight, smell, touch, taste, and hearing” (Lawless and Heymann, 2010, pp. 1–2). Among various sensory evaluation methods, subjective analysis comprising descriptive and discriminative tests is mostly employed in the food industry (Yang and Lee, 2019). However, reliance on human senses for sensory evaluation has limited the applicability of subjective analysis. The various limitations encountered can be summarized as the increased probability of human errors in perceiving the complex food–receptor interactions, the influence of the external environment, and insubstantial testing of food samples (Marques et al., 2022).
Recent technological advancements have suggested a more feasible approach: artificial senses. The concept of artificial senses is based on the ability of instruments to mimic the human senses (Ghasemi-Varnamkhasti et al., 2010). Some of these instruments are the electronic eye (e-eye), electronic nose (e-nose), and electronic tongue (e-tongue), mimicking the three major human senses, namely, vision, smell, and taste. Apart from overcoming the limitations of subjective analysis, the striking features of these instruments are consistency, easy calibration, and efficient result interpretation. Furthermore, these tools can be used independently as well as in conjunction with each other (Rodriguez-Mendez and Preddy, 2016).
An electronic nose (e-nose) is designed to detect and distinguish between the complex odors of the aroma components of a food sample. It consists of a wide variety of heterogeneous gas sensors with partial specificity and a pattern recognition system (Tan and Xu, 2020). The extensive application of the e-nose has been driven by advances in sensors, improvements in materials, innovative software, and progress in designs, circuits, and systems (Rodríguez-Mendez et al., 2016). Currently, the e-nose is widely used in various industries, including food and pharmaceuticals, and in other scientific research fields (Yakubu et al., 2021).
Electronic tongue (e-tongue), also known as an artificial tongue or a taste sensor, can be defined as an analytical tool used to classify the various tastes of different chemical substances in a liquid sample (Jain et al., 2010). It is commonly used in medical diagnosis, microbial analysis, environmental monitoring, and food technology (Tan and Xu, 2020). The e-tongue has the advantages of measuring toxic substances, conducting objective analysis, and being free of detection fatigue (Jiang et al., 2018).
Electronic eye (e-eye) is based on a system known as computer image analysis or computer vision. Computer vision mimics human vision by developing a theoretical groundwork and algorithms for the extraction and analysis of useful information about an object (Hong et al., 2014). It deals with the visual inspection of quality characteristics of food products by capturing, processing, and analyzing images. Due to its cost-effectiveness, superior speed, accuracy, and consistency, the e-eye has been successfully used for quality analysis of meat, fish, pizza, cheese, bread, and cereal grains (Koyama et al., 2021).
The excellent capabilities of these artificial senses to detect, recognize, and discriminate complex mixtures of chemicals have led to developments in electronic analogs of biological systems (Amoli et al., 2019). These tools have individually proven successful for quality assessment, shelf-life studies, and sensory evaluation of various food products. However, the fusion technique, which is based on a combination of artificial senses along with other analytical instruments, can form a powerful and objective analytical tool able to outperform the individual artificial senses (Di Rosa et al., 2018).
This article reviews the principle, construction, and working of sensory analytical tools; critically analyzes the limitations of these tools; and discusses potential applications of the fusion technique in the food industry.
The basic elements of objective sensory analytical tools are an array of sensors that sense the material, data acquisition and processing systems that generate a response, and a display system that converts responses into recordable digital information.
Sensors are the crucial elements of the electronic sensing system. In this system, the input signal is converted to an output signal in the form of electric, chemical, magnetic, thermal, or radiation energy (Magdalena et al., 2014).
Broadly, the sensors used in the e-nose and e-tongue can be divided into five groups. Electrochemical sensors measure changes in voltage, electric current, impedance, or conductivity; these include potentiometric, voltammetric, amperometric, impedimetric, and conductometric sensors (Jiang et al., 2018). Piezoelectric sensors use piezoelectric substrates (such as lithium niobate, quartz crystal, and ZnO) and measure the impact of absorbed or adsorbed sample components on resonance and electric current. Since most of the samples tested using these sensors are odorants, these sensors find utilization in e-noses (Go et al., 2017). Colorimetric sensors are based on electromagnetic phenomena such as fluorescence, absorbance, or reflection; the changes in spectral properties due to the interaction of food components with the sensors are used as an indicator (Ajay Priya et al., 2017). Biosensors rely on the interaction of biological components or biomolecules with target analytes. These sensors are usually combined to form a biosensing analytical system and help in the detection of enzymes, microbes, or other food components (Tan and Xu, 2020). Other sensors are based on chromatographic or spectroscopic principles.
The sensors used in the e-eye, however, are ultrasound sensors, X-ray sensors, charge-coupled devices, etc., which create images of the object. Thermal imaging cameras and terahertz cameras can also be used for image analysis. The major factors responsible for a high-quality image are illumination and the lighting arrangement. Light sources vary according to the purpose of use; common sources are incandescent and fluorescent lamps, lasers, infrared lamps, and X-ray tubes (Koyama et al., 2021; Tan and Xu, 2020).
The changes to sensors are recorded as data and the complex data gathered are interpreted through multivariate statistical analysis such as Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Discriminant Function Analysis (DFA), Partial Least Squares (PLS), Soft Independent Modeling Class Analogy (SIMCA), etc. These methods are discussed in detail by Magdalena et al. (2014) and Jiang et al. (2018).
Depending on the test data being analyzed or the aim of data interpretation, each of these analyses has a specific application. For example, PCA is mostly used to compress, model, and visualize multivariate data, whereas PLS is a variant of PCA in which a response variable is predicted. In the SIMCA method, a separate model based on the principal component approach is designed for each class (Magdalena et al., 2014). Furthermore, HCA is used to cluster samples into homogeneous groups. The objective of LDA is to design linear discriminant functions for samples in the model set that pertain to specific groups. In DFA, data are classified with a discriminant prediction equation to observe the differences among or between groups (Lee and Lee, 2014). In the case of nonlinear responses, an Artificial Neural Network (ANN) can also be used for data modeling. An artificial neural network imitates a biological neural network, which collects and conveys signals to the central nervous system, processes the data, and then makes specific decisions depending on the identified objects (Baldwin et al., 2011; Gowen et al., 2010).
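As an illustration of how such methods condense multivariate sensor data, the following minimal Python sketch implements PCA from the eigendecomposition of the covariance matrix; the e-nose response matrix is invented for illustration, and real workflows would typically rely on an established chemometrics package.

```python
import numpy as np

def pca(X, n_components=2):
    """Project sensor readings onto the leading principal components.

    X: (n_samples, n_sensors) matrix of sensor responses.
    Returns scores (n_samples, n_components) and explained-variance ratios.
    """
    Xc = X - X.mean(axis=0)                 # mean-center each sensor channel
    cov = np.cov(Xc, rowvar=False)          # sensor-by-sensor covariance
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: covariance is symmetric
    order = np.argsort(eigvals)[::-1]       # sort components by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Xc @ eigvecs[:, :n_components]
    explained = eigvals[:n_components] / eigvals.sum()
    return scores, explained

# Hypothetical e-nose responses: 6 samples x 4 gas sensors
X = np.array([[1.0, 2.1, 0.5, 3.0],
              [1.1, 2.0, 0.6, 3.1],
              [4.0, 0.9, 2.5, 1.0],
              [4.2, 1.0, 2.4, 0.9],
              [2.5, 1.5, 1.5, 2.0],
              [2.6, 1.4, 1.6, 2.1]])
scores, explained = pca(X, n_components=2)
```

Plotting the two score columns against each other gives the cluster maps commonly shown in e-nose and e-tongue studies.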
E-nose usually consists of a set of sensors, electronic components, pumps, a flow meter, and data analysis software (Tan et al., 2019). The first element of the e-nose system is the sample handling system, which introduces the volatile components of the sample and involves different sampling procedures such as headspace sampling, bubblers, diffusion methods, or preconcentrators (Miguel and Laura, 2009). The second element is a detection system consisting of an array of gas sensors. These sensors convert a chemical quantity into an electrical signal correlated to the concentration of specific particles such as atoms, molecules, ions, or gases. The kinds of electrochemical sensors used for the e-nose include metal oxide gas sensors, fiber-optic gas sensors, etc. (Koyama et al., 2021; Miguel and Laura, 2009; Yakubu et al., 2021). Finally, the third element is the data analysis and processing system, which helps in interpreting the data generated by the detection system (Tan et al., 2019).
The sensor array is the backbone of the e-tongue. It consists of varied sensors that generate output signals to be processed by software. Most e-tongues are developed around electrochemical sensors incorporating an electrode system with three electrodes, namely, a working electrode, a reference electrode, and an auxiliary electrode (Jiang et al., 2018). Another model, developed at the University of Texas, is based on illumination, with a sensor array of polymer beads arranged on a silicon wafer (Jain et al., 2010).
A computer vision system generally consists of five basic components: a lighting system, a camera, an image capture board (frame grabber or digitizer), a personal computer, and software. Figure 1 depicts the schematic structure of image analysis.
Figure 1. Schematic structure of computer vision (Hong et al., 2014).
The lighting system plays a major role in capturing images and must provide a consistent picture by minimizing variations in appearance. Lighting arrangements can use front or back lighting: front lighting helps in the detection of external surface characteristics, while back lighting enhances the background of objects. Back lighting is used for the detection of translucent objects, such as hatching eggs, or for measuring the geometric dimensions of obscured objects, such as shoot height or pine diameter (Patel et al., 2011).
Digital cameras are used for image capture. The image capture board converts pictorial images into numerical form, a process called digitization (dividing images into pixels using a frame grabber). A personal computer or microprocessor system provides data storage. The system also includes a high-resolution display unit, such as an LCD, which aids in analyzing images and the results of the various analyses (Kodagali and Balaji, 2012).
The basic idea behind the working of these analytical tools is a reversible physical, chemical, or electrical change produced in the sensing material due to interactions with the sample components. These changes generate data (responses) that are recorded, analyzed, and interpreted to measure the sensorial characteristics (Pereira et al., 2021).
The most common transduction principles are based on electrical measurements, where changes in voltage, current, impedance, or electric fields are measured. However, measurements of mass or physical change, temperature change, or heat generation are also used. Optical sensors measure the modulation in light-related properties such as color and wavelength (Miguel and Laura, 2009; Pereira et al., 2021).
The analysis in e-nose takes place by placing the sample into an airtight sampling chamber (Figure 2).
Figure 2. Working of an e-nose (Tan and Xu, 2020).
In this chamber, the product should be kept for at least 1 h so that the volatile components are released. An aliquot of the sample headspace is then pumped onto the sensors using a syringe. The measurement phase lasts about 60 s, after which the standby phase is activated. In the standby phase, the system is cleansed by pumping fresh air and the sensors are reset to the baseline level (Tan et al., 2019). The air first crosses the sensor array and then reaches the concentration chamber. The signals produced are recorded using transducer recording devices and are then preprocessed and conditioned by various pattern recognition systems (Gomez et al., 2007; Tan and Xu, 2020).
The working of the e-tongue can be explained by two different models, based on differences in the sensor arrays used. The first model relies on electrochemical sensors and an electrode system (Figure 3).
Figure 3. Working of an e-tongue (Leon-Madina et al., 2019).
The working electrode is made of an inert material such as platinum, glassy carbon, gold, iridium, or rhodium; the electrochemical interactions of interest occur on its surface. The reference electrode is used for measuring the potential of the working electrode. Usually, Ag/AgCl is used as the reference electrode: it functions as a redox electrode whose equilibrium between silver metal (Ag) and its salt, silver chloride (AgCl, also called silver(I) chloride), stabilizes the voltage without adding to the potential of the working electrode. The auxiliary electrode, typically a stainless-steel conductor, completes the cell circuit. A relay box is used to connect these electrodes (Jiang et al., 2018; Leon-Madina et al., 2019).
The second e-tongue design, developed at the University of Texas, utilizes image processing. It consists of a light source, a sensor array, and a detector. The light source illuminates the sensor chip, which is composed of polymer beads arranged on a small silicon wafer. The chemically adapted polymer beads undergo a color change depending on the presence and quantity of specific chemicals. The color change is then captured using a digital camera, and the transmitted signal is converted into data using a video capture board and a computer (Jain et al., 2010).
The basic mechanism behind the working of computer vision includes image acquisition, preprocessing, segmentation, representation, and recognition. Image acquisition aims at converting the electronic signals from a sensing device into numeric form. The preprocessing stage involves operations such as noise reduction, gray-level correction, geometrical correction, and correction of defocused images; its goal is to enhance image quality. Furthermore, segmentation partitions a digital image into multiple segments known as super-pixels, simplifying and/or changing the representation of the image for easier analysis. Representation and description are carried out using statistical classifiers or multilayer neural networks. Finally, recognition and interpretation of results are carried out using techniques such as fuzzy logic, decision trees, and genetic algorithms (Du and Sun, 2006).
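The segmentation stage described above can be sketched, in its simplest form, as intensity thresholding; the following Python example uses a synthetic image and an arbitrary threshold chosen purely for illustration (real systems would select it from the image histogram, e.g. with Otsu's method).

```python
import numpy as np

def segment_by_threshold(image, threshold):
    """Split a grayscale image into foreground and background by intensity.

    Returns a boolean mask (True = foreground) and the foreground pixel
    count, a stand-in for the object-area measurement a vision system reports.
    """
    mask = image > threshold
    return mask, int(mask.sum())

# Synthetic 8x8 image: a bright 3x3 "object" on a dark background
image = np.full((8, 8), 20, dtype=np.uint8)
image[2:5, 3:6] = 200
mask, area = segment_by_threshold(image, threshold=100)  # area -> 9 pixels
```

The resulting mask is what later stages (representation, recognition) would describe with shape and texture features.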
Fuzzy logic is based on “degrees of truth” rather than the usual “true or false” (0 or 1). In a decision tree, each node represents a feature (attribute), each link (branch) represents a decision (rule), and each leaf represents an outcome (a categorical or continuous value). Genetic Algorithms (GAs) are adaptive search algorithms based on the evolutionary ideas of natural selection and genetics (Jin et al., 2013; Koyama et al., 2021).
The design of these tools is based on the fact that they mimic biological systems. In the human nose, olfactory receptor neurons respond to odorants, and the resulting signals are interpreted in the brain (Yakubu et al., 2021). Similarly, in the e-nose, the sensors detect odorants by converting the sensor response into a digital signal. These responses are called fingerprints: similar aromas share the same fingerprint, whereas different aromas differ in their fingerprints, easing the detection of odors (Milna et al., 2014).
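The fingerprint idea can be sketched as nearest-reference matching: an unknown sensor-array response is assigned to the stored fingerprint it is closest to. All fingerprint values below are invented for illustration, and real systems use the pattern recognition methods discussed earlier rather than a single distance measure.

```python
import numpy as np

# Hypothetical reference fingerprints: mean sensor-array response per aroma
fingerprints = {
    "apple":  np.array([0.8, 0.2, 1.5, 0.4]),
    "banana": np.array([0.3, 1.1, 0.2, 0.9]),
    "coffee": np.array([1.9, 0.7, 0.6, 1.6]),
}

def identify(response, fingerprints):
    """Assign an unknown response to the closest stored fingerprint
    (Euclidean distance over the sensor channels)."""
    return min(fingerprints,
               key=lambda k: np.linalg.norm(response - fingerprints[k]))

unknown = np.array([0.75, 0.25, 1.4, 0.45])  # close to the "apple" pattern
label = identify(unknown, fingerprints)       # -> "apple"
```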
In humans, the taste buds contain receptors that identify the flavor-imparting components and trigger nerve signals. Taste arises from a complex network of chemical reactions that activates neuronal circuits (Mattes and Popkin, 2009). Each taste sensation produces a fingerprint signal generated by the activation of different taste receptors. Similarly, the basic principle of the e-tongue is to combine the signals from specific, overlapping, or nonspecific sensors with pattern recognition systems (Rodriguez-Mendez et al., 2010).
The basic principle behind computer vision is to capture images using a digital camera and then process those images using data processing techniques. In humans, the eye acts as a digital camera, and the brain acts as the data processing system (Chouhan et al., 2020).
Electronic nose (e-nose), electronic tongue (e-tongue), and computer vision system (e-eye) are three analytical systems that have been used separately in the food and pharmaceutical industries as quality determination methods (Jiang et al., 2018; Kakani et al., 2020; Karakaya et al., 2019). These three systems are nondestructive, rapid, consistent, and economic; their fusion can form a powerful and objective inspection tool able to outperform the individual techniques. The combined evaluation technique has a variety of applications replacing the system that may not have sufficient performance individually for specific uses (Kiani et al., 2016).
Novel methods combining artificial senses can rapidly achieve precise results compared with the individual sensors (Apetrei et al., 2010). The use of multivariate analysis methods, together with objective senses, proved to be very powerful. In recent years, much research has been performed to develop several data fusion strategies, combining the output of multiple instrumental sources for food quality improvement (Di Rosa et al., 2017). Figure 4 represents the data fusion strategies used in the field of food quality assessment.
Figure 4. Data fusion strategies used in the food quality assessment (Borras et al., 2015).
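Of these strategies, low-level data fusion can be sketched as autoscaling each instrument's feature block and concatenating the blocks into one matrix for subsequent multivariate analysis. The e-nose and e-tongue readings below are hypothetical, and the sketch omits the variable selection usually applied before fusion.

```python
import numpy as np

def low_level_fusion(*blocks):
    """Fuse feature blocks from different instruments into one matrix.

    Each block is (n_samples, n_features). Blocks are autoscaled
    (zero mean, unit variance per feature) so that no instrument dominates
    merely because of its measurement scale, then concatenated column-wise.
    """
    scaled = []
    for X in blocks:
        X = np.asarray(X, dtype=float)
        scaled.append((X - X.mean(axis=0)) / X.std(axis=0))
    return np.hstack(scaled)

# Hypothetical readings for 4 samples: 3 e-nose sensors, 2 e-tongue sensors
e_nose   = [[0.2, 1.5, 3.0], [0.3, 1.4, 3.2], [0.9, 0.5, 1.0], [1.0, 0.6, 1.1]]
e_tongue = [[120, 0.01], [118, 0.02], [240, 0.09], [236, 0.08]]
fused = low_level_fusion(e_nose, e_tongue)   # 4 samples x 5 fused features
```

The fused matrix can then be fed to PCA, LDA, or PLS exactly as a single-instrument data set would be.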
A composite e-tongue based on data fusion of two distinct sensor families was developed by Haddi et al. to discriminate three types of beer (Haddi et al., 2011). Three modified graphite-epoxy voltammetric sensors plus six potentiometric sensors were combined to make a sensor array. The sensor array, integrated with pattern recognition and feature extraction methods, namely, discriminant factor analysis (DFA) and PCA, was trained to categorize data clusters related to different types of beer. Qiu et al. used the e-tongue and e-nose and their fusion to identify strawberry juices produced with different processing methods (Qiu et al., 2015). They observed that the e-tongue and e-nose could be used separately for strawberry juice classification but were not sufficient to determine the values of all quality parameters. Table 1 summarizes a few applications of the fusion technique in food quality analysis. Di Rosa et al. analyzed the data from single-modality and fusion methods to classify different Sicilian honey varieties (Di Rosa et al., 2018). Combining a potentiometric electronic tongue and a computer vision system resulted in a fast, simple, and inexpensive tool for discriminating honey’s botanical origin. Tretola et al. investigated safety features of selected former food products (FFPs) intended for animal nutrition, checking for undesired ingredients that can be identified as remnants of packaging materials by stereomicroscopy coupled with a computer vision system (Tretola et al., 2017a, 2017b). They reported that such an innovative computer vision system demonstrates the possibility of rapid detection of packaging remnants in FFPs.
Table 1. Applications of fusion technique in food quality analysis.
| Sample | Observed quality parameter | Fusion system used | Reference |
|---|---|---|---|
| Fruit juices | Classification of juices | E-nose and e-tongue | Kiani et al. (2016) |
| Fish | Evaluation of freshness, firmness, and quality | Computer vision system and NIRS (near-infrared spectroscopy) | Kiani et al. (2016) |
| Honey | Identification of its botanical origin | Potentiometric electronic tongue and CVS | Di Rosa et al. (2017) |
| Packaging remnants | Former food products’ safety | Stereomicroscopy coupled with computer vision system | Tretola et al. (2017a) |
| Pork | Quantification of total viable count | Hyperspectral imaging with colorimetric sensors | Li et al. (2016) |
| Pork | Detection of total volatile basic nitrogen (TVB-N) | Hyperspectral imaging with odor imaging sensors | Li et al. (2015) |
| Pork | Indicator of freshness | NIRS (near-infrared spectroscopy) and multispectral imaging technique (MSI) | Huang et al. (2015) |
| Pork | Evaluating freshness | NIRS (near-infrared spectroscopy), computer vision (CV), and electronic nose (e-nose) | Huang et al. (2014) |
| Rice wine | Food sensory quality | E-eye, e-nose, and e-tongue combining multiple cross-perception sensors | Ouyang et al. (2014) |
| Chestnut | Identification of chestnut in terms of classification, sensitivity, and specificity | Sensory and FT-NIR spectral data | Corona et al. (2021) |
| Brazilian monovarietal olive oil | Storage effects on Brazilian monovarietal extra virgin olive oil | Data fusion and chemometrics | Gonçalves et al. (2020) |
Electronic mucosa (an artificial olfactory microsystem) is a novel technology that mimics the basic structure of the mammalian olfactory system to supply spatial and temporal chemosensory information. Ghasemi-Varnamkhasti and Aghbashlo (2014) reported the opportunities and challenges in applying the electronic nose and electronic mucosa as creative instruments in drying technology. The electronic mucosa is an inventive kind of electronic nose that imitates the nasal chromatograph effect, offering additional useful information and a higher level of identification than existing electronic nose systems. A typical electronic mucosa consists of three large arrays of sensors with two retentive columns. In this device, all sensor arrays furnish spatial information like the common electronic nose, but with numerous sensors, while the second and third arrays deliver temporal profiles different from the first array and from each other (Che Harun et al., 2009).
A fusion model of an odor sensor and a highly advanced optical sensor was used to evaluate total volatile basic nitrogen (TVB-N) content in chicken meat. The aroma and odor data variables were collected using the odor sensor, that is, the colorimetric sensor, while the spectral and textural data variables were obtained from the optical sensor, that is, the hyperspectral imaging (HSI) system. The results encouraged multiple-sensor fusion for better model performance in evaluating chicken meat freshness (Khulal et al., 2017).
Volatile profiles of 10 different varieties of fresh jujubes were characterized using a fusion model of HS-SPME/GC-MS (headspace solid-phase micro-extraction combined with gas chromatography-mass spectrometry) and e-nose (electronic nose). A total of 51 aroma compounds were identified in jujubes. Hexanoic acid, hexanal, (E)-2-hexenal, (Z)-2-heptenal, benzaldehyde, and (E)-2-nonenal were the main aroma components. Cluster analysis for GC-MS, principal component analysis, and linear discriminant analysis for e-nose data were conducted for differentiation of jujube varieties. The study concluded that the selection of fusion models was relevant for the characterization and differentiation of volatile profiles (Chen et al., 2018).
NIRS was used for the rapid detection and quantification of pure volatile organic compounds (ethanol, ethyl acetate, and acetic acid) from mixed VOCs by employing sensitive intermediary chemo-responsive dyes as capture probes. NIRS spectra were scanned, preprocessed, and used to build partial least squares (PLS) prediction models. The study concluded that the usage scope of NIRS and e-nose-based colorimetric sensors for rapid detection and quantification of VOC content in materials can be widened (Kutsanedzie et al., 2018).
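The PLS calibration step mentioned in such studies can be sketched as follows. This is a minimal single-response NIPALS implementation fitted to synthetic "spectra" (the latent factors, loadings, and response values are all invented), not the preprocessing pipeline or software used in the cited work.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit a PLS1 regression model with the NIPALS algorithm.

    X: (n, p) predictors (e.g. preprocessed spectra); y: (n,) response
    (e.g. a VOC concentration). Returns the regression vector plus the
    means needed to center new data at prediction time.
    """
    X, y = np.asarray(X, float), np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)            # X weights
        t = Xr @ w                        # latent scores
        tt = t @ t
        p = Xr.T @ t / tt                 # X loadings
        qk = (yr @ t) / tt                # y loading
        Xr = Xr - np.outer(t, p)          # deflate X and y
        yr = yr - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # coefficients in original X space
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (np.asarray(X, float) - x_mean) @ B + y_mean

# Synthetic "spectra" driven by two latent factors, with y linear in them
T = np.array([[ 1.0,  0.5], [ 0.2, -1.0], [-1.5,  0.3],
              [ 0.7,  1.2], [-0.4, -0.8], [ 1.1, -0.6]])
L = np.array([[0.5,  1.0, -0.3, 0.8],
              [1.2, -0.4,  0.9, 0.1]])
X = T @ L                                 # 6 samples x 4 "wavelengths"
y = T @ np.array([2.0, -1.0])
B, xm, ym = pls1_fit(X, y, n_components=2)
pred = pls1_predict(X, B, xm, ym)
```

With two components and a response that depends linearly on the two latent factors, the model reproduces the calibration data; in practice, the number of components is chosen by cross-validation.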
These tools were designed to reduce the tedium of the sensory process conducted by human panelists. Their design has created excellent nondestructive methods that have been successfully used for the determination of major and minor food components. However, certain limitations persist that must be addressed before wider industrial use (Calvini and Pigani, 2022).
Firstly, the baseline of the measurements used in these analytical tools, that is, the physicochemical changes responsible for modification of the stability, organoleptic properties, and/or typicality of food products, is mostly unexplained by traditional chemistry. Also, cumbersome data pretreatment, differing signal scales, and possible instrument drift and baseline shifts pose noticeable concerns in this field (Smyth and Cozzolino, 2013).
Another disadvantage is the environmental sensitivity of these tools: even a small change in temperature or moisture leads to sensor drift. More extensive research is needed to identify the types of sensors best suited to detecting particular food components, as well as the best statistical methods for processing the data, before the practical applications of these tools in industry can reach a reliable level (Milna et al., 2014).
The fusion technique has been applied to the collective assessment of food quality attributes, but the system is complex and cannot be used effectively for the differentiation and sampling of individual food components. Hence, intensive investigation is required to broaden the applications of this technique (Di Rosa et al., 2017).
The growing era of technological innovation has gifted the field of sensory science with artificial senses and the fusion technique. The conventional reliance on human senses for predicting the sensorial characteristics of food complicated the interpretation of results, the major challenge of subjective sensory analysis being the increased chance of human error. With the ability to mimic human senses, the e-tongue, e-nose, e-eye, and fusion technique are capable of overcoming these shortcomings of subjective sensory analysis. This review has described the working and construction of these tools and discussed their various applications in the food industry. Analytical tool fusion yields a system with comprehensive and complementary information about foods, exceeding the performance of each instrument applied individually. Hence, it can be concluded that future-oriented growth and research in this field will allow a better understanding of these devices, leading to increased implementation in the food industry.
Ajay Priya, V.S, Joseph, P., Kiruba Daniel, S.C.G., Lakshmanan, S., Kinoshita, T. and Muthusamy, S., 2017. Colorimetric sensors for rapid detection of various analytes. Materials Science and Engineering: C 78: 1231–1245. 10.1016/j.msec.2017.05.018
Amoli, V., Kim, S.Y., Kim, J.S., Choi, H., Koo, J. and Kim, D.H., 2019. Biomimetics for high-performance flexible tactile sensors and advanced artificial sensory systems. Journal of Materials Chemistry C 7(47): 14816–14844. 10.1039/C9TC05392A
Apetrei, C., Apetrei, I.M., Villanueva, S., De Saja, J.A., Gutierrez-Rosales, F. and Rodriguez-Mendez, M.L., 2010. Combination of an e-nose, an e-tongue and an e-eye for the characterization of olive oils with different degree of bitterness. Analytica Chimica Acta 663: 91–97. 10.1016/j.aca.2010.01.034
Baldwin, E.A., Bai, J., Plotto, A. and Dea, S., 2011. Electronic noses and tongues: applications for the food and pharmaceutical industries. Sensors 11: 4744–4766. 10.3390/s110504744
Borras, E., Ferre, J., Boque, R., Mestres, M., Acena, L. and Busto, O., 2015. Data fusion methodologies for food and beverage authentication and quality assessment–a review. Analytica Chimica Acta 891: 1–14. 10.1016/j.aca.2015.04.042
Calvini, R. and Pigani, L., 2022. Toward the development of combined artificial sensing systems for food quality evaluation: a review on the application of data fusion of electronic noses, electronic tongues and electronic eyes. Sensors 22(2): 577. 10.3390/s22020577
Che Harun, F.K., Taylor, J.E., Covington, J.A. and Gardner, J.W., 2009. An electronic nose employing dual-channel odor separation columns with large chemosensor arrays for advanced odour discrimination. Sensors and Actuators B 141: 134–140. 10.1016/j.snb.2009.05.036
Chen, Q., Song, J., Bi, J., Meng, X. and Wu, X., 2018. Characterization of volatile profile from ten different varieties of Chinese jujubes by HS-SPME/GC–MS coupled with E-nose. Food Research International 105: 605–615. 10.1016/j.foodres.2017.11.054
Chouhan, S.S., Singh, U.P. and Jain, S., 2020. Applications of computer vision in plant pathology: a survey. Archives of Computational Methods in Engineering 27(2): 611–632. 10.1007/s11831-019-09324-0
Corona, P., Frangipane, M.T., Moscetti, R., Lo Feudo, G., Castellotti, T. and Massantini, R., 2021. Chestnut cultivar identification through the data fusion of sensory quality and FT-NIR spectral data. Foods 10: 2575. 10.3390/foods10112575
Di Rosa, A.R., Leone, F., Cheli, F. and Chiofalo, V., 2017. Fusion of electronic nose, electronic tongue and computer vision for animal source food authentication and quality assessment–a review. Journal of Food Engineering 210: 62–75. 10.1016/j.jfoodeng.2017.04.024
Di Rosa, A.R., Leone, F., Scattareggia, C. and Chiofalo, V., 2018. Botanical origin identification of Sicilian honeys based on artificial senses and multi-sensor data fusion. European Food Research and Technology 244: 117–125. 10.1007/s00217-017-2945-8
Du, C.J. and Sun, D.W., 2006. Learning techniques used in computer vision for food quality evaluation: a review. Journal of Food Engineering 72: 39−55. 10.1016/j.jfoodeng.2004.11.017
Ghasemi-Varnamkhasti, M. and Aghbashlo, M., 2014. Electronic nose and electronic mucosa as innovative instruments for real-time monitoring of food dryers. Trends in Food Science and Technology 38: 158–166. 10.1016/j.tifs.2014.05.004
Ghasemi-Varnamkhasti, M., Mohtasebi, S.S. and Siadat, M., 2010. Biomimetic-based odor and taste sensing systems to food quality and safety characterization: an overview on basic principles and recent achievements. Journal of Food Engineering 100: 377–387. 10.1016/j.jfoodeng.2010.04.032
Go, D.B., Atashbar, M.Z., Ramshani, Z. and Chang, H.C., 2017. Surface acoustic wave devices for chemical sensing and microfluidics: a review and perspective. Analytical Methods 9(28): 4112–4134. 10.1039/C7AY00690J
Gomez, A.H., Wang, J., Hu, G. and Pereira, A.G., 2007. Discrimination of storage shelf-life for mandarin by electronic nose technique. LWT-Food Science and Technology 40: 681–689. 10.1016/j.lwt.2006.03.010
Goncalves, T.R., Rosa, L.N., Torquato, A.S., Da Silva, L.F., Março, P.H., Gomes, S.T.M., Matsushita, M. and Valderrama, P., 2020. Assessment of Brazilian monovarietal olive oil in two different package systems by using data fusion and chemometrics. Food Analytical Methods 13: 86–96. 10.1007/s12161-019-01511-w
Gowen, A.A., Tiwari, B.K., Cullen, P.J., McDonnell, K. and O’Donnell, C.P., 2010. Applications of thermal imaging in food quality and safety assessment. Trends in Food Science and Technology 21: 190–200. 10.1016/j.tifs.2009.12.002
Haddi, Z., Amari, Z., Bouchikhi, B., Gutierrez, J.M., Cetoc, X., Mimendia, A. and Vallem, M.D., 2011. Data fusion from voltammetric and potentiometric sensors to build a hybrid electronic tongue applied in classification of beers. Proceedings of the 14th International Symposium on Olfaction and Electronic Nose. AIP Conference Proceedings 1362: 189–190. 10.1063/1.3626353
Hong, H., Yang, X., You, Z. and Cheng, F., 2014. Visual quality detection of aquatic products using machine vision. Aquacultural Engineering 63: 62–71. 10.1016/j.aquaeng.2014.10.003
Huang, L., Zhao, J., Chen, Q. and Zhang, Y., 2014. Nondestructive measurement of total volatile basic nitrogen (TVB-N) in pork meat by integrating near infrared spectroscopy, computer vision and electronic nose techniques. Food Chemistry 145: 228–236. 10.1016/j.foodchem.2013.06.073
Huang, Q., Chen, Q., Li, H., Huang, G., Ouyang, Q. and Zhao, J., 2015. Non-destructively sensing pork’s freshness indicator using near infrared multispectral imaging technique. Journal of Food Engineering 154: 69–75.
Jain, H., Panchal, R., Pradhan, P., Patel, H. and Pasha, T.Y., 2010. Electronic tongue: a new taste sensor. International Journal of Pharmaceutical Sciences Review and Research 5: 91–96.
Jiang, H., Zhang, M., Bhandari, B. and Adhikari, B., 2018. Application of electronic tongue for fresh foods quality evaluation: a review. Food Reviews International 34(8): 746–769. 10.1080/87559129.2018.1424184
Jin, W., Ho, H.L., Cao, Y.C., Ju, J. and Qi, L.F., 2013. Gas detection with micro-and nano-engineered optical fibers. Optical Fiber Technology 19: 741–759. 10.1016/j.yofte.2013.08.004
Kakani, V., Nguyen, V.H., Kumar, B.P., Kim, H. and Pasupuleti, V.R., 2020. A critical review on computer vision and artificial intelligence in food industry. Journal of Agriculture and Food Research 2: 100033. 10.1016/j.jafr.2020.100033
Karakaya, D., Ulucan, O. and Turkan, M., 2019. Electronic nose and its applications: a survey. International Journal of Automation and Computing 17(2): 179–209. 10.1007/s11633-019-1212-9
Khulal, U., Zhao, J., Hu, W. and Chen, Q., 2017. Intelligent evaluation of total volatile basic nitrogen (TVB-N) content in chicken meat by an improved multiple level data fusion model. Sensors and Actuators B 238: 337–345. 10.1016/j.snb.2016.07.074
Kiani, S., Minaei, S. and Ghasemi-Varnamkhasti, M., 2016. Fusion of artificial senses as a robust approach to food quality assessment. Journal of Food Engineering 171: 230–239. 10.1016/j.jfoodeng.2015.10.007
Kodagali, J.A. and Balaji, S., 2012. Computer vision and image analysis based techniques for automatic characterization of fruits: a review. International Journal of Computer Applications 50: 6–12. 10.5120/7773-0856
Koyama, K., Tanaka, M., Cho, B.H., Yoshikawa, Y. and Koseki, S., 2021. Predicting sensory evaluation of spinach freshness using machine learning model and digital images. PLoS One 16(3): e0248769. 10.1371/journal.pone.0248769
Kutsanedzie, F.Y., Hao, L., Yan, S., Ouyang, Q. and Chen, Q., 2018. Near infrared chemo-responsive dye intermediaries spectra-based in-situ quantification of volatile organic compounds. Sensors and Actuators B 254: 597–602. 10.1016/j.snb.2017.07.134
Lawless, H.T. and Heymann, H., 2010. Introduction. In: Lawless H.T., Heymann H. (eds.) Sensory evaluation of food: principles and practices. 2nd ed. Springer Science and Business Media, New York, NY, pp. 1–2.
Lee, W.H. and Lee, W., 2014. Food inspection system using terahertz imaging. Microwave and Optical Technology Letters 56: 1211–1214. 10.1002/mop.28303
Leon-Medina, J.X., Cardenas-Flechas, L.J. and Tibaduiza, D.A., 2019. A data-driven methodology for the classification of different liquids in artificial taste recognition applications with a pulse voltammetric electronic tongue. International Journal of Distributed Sensor Networks 15(10): 1550147719881601. 10.1177/1550147719881601
Li, H., Chen, Q., Zhao, J. and Wu, M., 2015. Nondestructive detection of total volatile basic nitrogen (TVB-N) content in pork meat by integrating hyperspectral imaging and colorimetric sensor combined with a nonlinear data fusion. LWT-Food Science and Technology 63: 268–274. 10.1016/j.lwt.2015.03.052
Li, H., Kutsanedzie, F., Zhao, J. and Chen, Q., 2016. Quantifying total viable count in pork meat using combined hyperspectral imaging and artificial olfaction techniques. Food Analytical Methods 9: 3015–3024. 10.1007/s12161-016-0475-9
Śliwińska, M., Wiśniewska, P., Dymerski, T., Namieśnik, J. and Wardencki, W., 2014. Food analysis using artificial senses. Journal of Agricultural and Food Chemistry 62: 1423–1448. 10.1021/jf403215y
Marques, C., Correia, E., Dinis, L.T. and Vilela, A., 2022. An overview of sensory characterization techniques: from classical descriptive analysis to the emergence of novel profiling methods. Foods 11(3): 255. 10.3390/foods11030255
Mattes, R.D. and Popkin, B.M., 2009. Nonnutritive sweetener consumption in humans: effects on appetite and food intake and their putative mechanisms. The American Journal of Clinical Nutrition 89: 1–14. 10.3945/ajcn.2008.26792
Peris, M. and Escuder-Gilabert, L., 2009. A 21st century technique for food control: electronic noses. Analytica Chimica Acta 638: 1–15. 10.1016/j.aca.2009.02.009
Tudor Kalit, M., Marković, K., Kalit, S., Vahčić, N. and Havranek, J., 2014. Application of electronic nose and electronic tongue in the dairy industry. Mljekarstvo 64: 228–244. 10.15567/mljekarstvo.2014.0402
Ouyang, Q., Zhao, J. and Chen, Q., 2014. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion. Analytica Chimica Acta 841: 68–76. 10.1016/j.aca.2014.06.001
Patel, K.K., Kar, A., Jha, S.N. and Khan, M.A., 2011. Machine vision system: a tool for quality inspection of food and agricultural products. Journal of Food Science and Technology 42: 123–141. 10.1007/s13197-011-0321-4
Pereira, P.F., de Sousa Picciani, P.H., Calado, V. and Tonon, R.V., 2021. Electrical gas sensors for meat freshness assessment and quality monitoring: a review. Trends in Food Science and Technology 118: 36–44. 10.1016/j.tifs.2021.08.036
Qiu, S., Wang, J., Tang, C. and Du, D., 2015. Comparison of ELM, RF, and SVM on E-nose and E-tongue to trace the quality status of mandarin (Citrus unshiu Marc.). Journal of Food Engineering 166: 193–203. 10.1016/j.jfoodeng.2015.06.007
Rodríguez-Méndez, M.L., Apetrei, C. and De Saja, J.A., 2010. Electronic tongues purposely designed for the organoleptic characterization of olive oils. In: Olives and olive oil in health and disease prevention. Elsevier, Amsterdam, The Netherlands, pp. 525–532.
Rodríguez-Méndez, M.L. and Preedy, V.R., 2016. Electronic noses and tongues in food science. Academic Press, London.
Smyth, H. and Cozzolino, D., 2013. Instrumental methods (spectroscopy, electronic nose, and tongue) as tools to predict taste and aroma in beverages: advantages and limitations. Chemical Reviews 113: 1429–1440. 10.1021/cr300076c
Tan, J., Balasubramanian, B., Sukha, D., Ramkissoon, S. and Umaharan, P., 2019. Sensing fermentation degree of cocoa (Theobroma cacao L.) beans by machine learning classification models based electronic nose system. Journal of Food Process Engineering 42(6): e13175. 10.1111/jfpe.13175
Tan, J. and Xu, J., 2020. Applications of electronic nose (e-nose) and electronic tongue (e-tongue) in food quality-related properties determination: a review. Artificial Intelligence in Agriculture 4: 104–115. 10.1016/j.aiia.2020.06.003
Tretola, M., Di Rosa, A.R., Tirloni, E., Ottoboni, M., Giromini, C., Leone, F., Bernardi, C.E.M., Dell’Orto, V., Chiofalo, V. and Pinotti, L., 2017a. Former food products safety: microbiological quality and computer vision evaluation of packaging remnants contaminations. Food Additives and Contaminants Part A 34: 1427–1435. 10.1080/19440049.2017.1325012
Tretola, M., Ottoboni, M., Di Rosa, A.R., Giromini, C., Fusi, E., Rebucci, R., Leone, F., Dell'Orto, V., Chiofalo, V. and Pinotti, L., 2017b. Former food products safety evaluation: computer vision as an innovative approach for packaging remnants detection. Journal of Food Quality 2017: 1–6. 10.1155/2017/1064580
Tuorila, H. and Monteleone, E., 2009. Sensory food science in the changing society: opportunities, needs, and challenges. Trends in Food Science and Technology 20: 54–62. 10.1016/j.tifs.2008.10.007
Yakubu, H.G., Kovacs, Z., Toth, T. and Bazar, G., 2021. Trends in artificial aroma sensing by means of electronic nose technologies to advance dairy production—a review. Critical Reviews in Food Science and Nutrition. 10.1080/10408398.2021.1945533
Yang, J. and Lee, J., 2019. Application of sensory descriptive analysis and consumer studies to investigate traditional and authentic foods: a review. Foods 8(2): 54. 10.3390/foods8020054