Cyber-Physical Security Research at UMBC's Eclipse Lab

CDR Brien Croteau, Deepak Krishnankutty

University of Maryland, Baltimore County

CDR Brien Croteau graduated from the U.S. Naval Academy in 1999 with a B.S. in Systems Engineering and completed an M.S. in Control Systems Engineering at Rensselaer Polytechnic Institute in 2000. He served as a Naval Flight Officer for 16 years, flying in the EA-6B Prowler and EA-18G Growler aircraft, and attended the U.S. Naval Test Pilot School in 2007. In 2016, he was selected to become a Permanent Military Professor and began a Ph.D. program in Electrical Engineering at the University of Maryland, Baltimore County. After completing that degree, he will join the Cyber Science department at the U.S. Naval Academy. His research interests lie at the nexus between hardware security and higher-level control systems applications.

Deepak Krishnankutty received the B.Tech. degree in Computer Science and Engineering from the University of Calicut, Kerala, India, in 2006, and the M.Tech. degree in Computer Science and Engineering (Information Security) from the National Institute of Technology Rourkela (NITR), Rourkela, India, in 2009. In 2013, he joined the Department of Computer Engineering at the University of Maryland, Baltimore County, as a Ph.D. student after a three-year stint as a lecturer in Kerala, India. His research interests are in hardware security and its countermeasures.

Mechanical Engineering 139(03), S18-S23 (Mar 01, 2017) (6 pages) Paper No: ME-17-MAR7; doi: 10.1115/1.2017-Mar-7

This article focuses on the research and development work at the Eclipse Research Cluster at the University of Maryland, Baltimore County (UMBC) to address cybersecurity challenges. The research team draws on a diverse set of specialty areas, including Computer Science, Computer Engineering, and Electrical Engineering. The group is leveraging physical relationships to provide diversity of measurement and reporting, not only to improve anomaly detection but also to make decisions about how to keep critical functions operating, even if only in a degraded mode. By exploiting the physical relationships between pressing a brake pedal and the operator's leg position, and between the power consumption of a sensor and the instructions it is running, the group proposes new indicators that can be used to increase resilience to cyberattacks. The concept is described here for a small section of a typical vehicle system. The group's future research seeks to expand this general approach of relating sensors to the physical properties they measure, or actuators to the causes and effects of their actions.

The Eclipse Research Cluster at the University of Maryland, Baltimore County (UMBC), led by Professors Nilanjan Banerjee, Chintan Patel, and Ryan Robucci, is seeking to address cybersecurity challenges by employing a diverse set of specialty areas, including Computer Science, Computer Engineering, and Electrical Engineering. The group is leveraging physical relationships to provide diversity of measurement and reporting, not only to improve anomaly detection but also to make decisions about how to keep critical functions operating, even if only in a degraded mode.

Cyber-physical attacks can occur at multiple levels of a system hierarchy. The group plans to demonstrate solutions at multiple levels that use the physical constraints of a plant or system and expand into non-traditional inputs and measurements, providing heterogeneous measurement surfaces that should be more resilient to traditional cyberattacks. The two systems this article covers are an IC-level side-channel power monitoring system and add-on trusted sensors for automobiles.

Many of the control systems that regulate our critical infrastructure, such as power systems or water treatment, have back-end controllers located close to the physical entities they measure or control. Cárdenas et al. break a typical control application into three levels: “In the first layer the physical infrastructure is instrumented with sensors and actuators. These field devices are connected via a field area network to programmable logic controllers (PLCs) or remote terminal units (RTUs), which in turn implement local control actions (regulatory control). A control network carries real-time data between process controllers and operator workstations. The workstations are used in area supervisory control, planning the physical infrastructure setpoints. The higher level is the site manufacturing operations, which is in charge of production control, optimizing the process, and keeping a process history.” [1]

The controllers at the lowest level of such a scheme usually contain microcontroller or Field Programmable Gate Array (FPGA) chips that can be reprogrammed for maintenance or future upgrades. These also represent a critical vulnerability: if attackers can modify the firmware running on these controllers, they can learn much about the rest of the system or launch sophisticated attacks that are hard to trace back to the rogue low-level controllers. One famous attack on a supervisory control and data acquisition (SCADA) system targeted the Maroochy Shire Council's sewage treatment system in Queensland, Australia in 2000, where a disgruntled contractor who had helped install the system caused 800,000 liters of raw sewage to spill into local parks, rivers, and even the grounds of a Hyatt Regency hotel, killing marine life and turning creek water black with an unbearable stench [2]. The UMBC researchers propose that additional measures can be taken to protect against attacks like this.

A critical aspect of a secure system is the identification of system components in which trust can be established. A common view treats hardware as a workhorse driven by software to carry out a specific task (Figure 1a); in reality, however, software exists in a virtual world running inside a hardware layer (Figure 1b). All contact that software has with the outside physical world must pass through, and is affected by, the hardware layer, and smart security designers can use this to measure and limit vulnerabilities.

FIGURE 1 Comparison of different views on the relationship between hardware and software: (a) depicts hardware driven by software; (b) depicts software running inside the hardware. This group leverages the second viewpoint, in which hardware forms the interface between software and the outside physical world.


This suggests security should be grounded in hardware.

Vulnerabilities take on many forms; one way to categorize them is by the layer of abstraction of the electronic computing device being studied. As seen in Figure 2, at the lowest level the individual integrated circuits can be subjected to side-channel attacks, which exploit outputs never intended by the designers to extract information about data passing through the chips. They can also be subjected to malicious hardware modifications (trojan attacks) during the design and manufacturing process. At a higher level, the various hardware components that make up the entire system are vulnerable to the introduction of counterfeit components, as well as other supply chain issues, because the system may be put together by multiple untrusted entities. At the firmware layer, there is the potential for backdoor vulnerabilities, including additional code being inserted or modified during manufacturing or after deployment. At the network layer, where several of these devices are connected together, one must consider the possibility that some nodes are compromised and counter the injection or flooding of malicious data.

Many vulnerabilities arise from the increasing globalization of electronics supply chains, in which components are made, sold, and stored for future use in response to the economic realities of the electronics industry. The U.S. Department of Defense has launched an initiative to combat the pervasive growth of counterfeit parts, due in large part to how the electronics manufacturing industry has fragmented under the pressures of a global economy [3]. In 2014, DARPA announced its Supply Chain Hardware Integrity for Electronics Defense (SHIELD) program [4], inviting research into a hardware root of trust: a cryptographic key enclosed in a tiny, physically fragile, anti-tamper semiconductor ‘dielet’ that self-destructs upon any attempt to physically remove, modify, or open it [5]. One promising avenue of verification involves Physical Unclonable Functions (PUFs), described as “the exploitation of inherent and naturally occurring physical disorder (fingerprint) of the device as its unique signature, e.g., silicon manufacturing variations. A PUF is a (partially) disordered physical system: when interrogated by a challenge (or input, stimulus), it generates a unique device response (or output).” [6]
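The challenge-response idea behind a PUF can be illustrated with a toy model in Python. Here the device “fingerprint” is simulated as a fixed per-device secret, and verification simply replays an enrolled challenge; this is only a conceptual sketch with invented names, not the SHIELD dielet or any specific PUF construction.

```python
import hashlib
import random

class ToyPUF:
    """Toy stand-in for a silicon PUF: a per-device secret seed models
    manufacturing variation; responses are a deterministic function of
    (seed, challenge). Real PUFs are noisy and analog; this is only a sketch."""

    def __init__(self, device_seed: int):
        self._seed = device_seed  # models uncontrollable process variation

    def respond(self, challenge: bytes) -> bytes:
        h = hashlib.sha256(self._seed.to_bytes(8, "big") + challenge)
        return h.digest()[:4]  # short response for readability


# Enrollment: the verifier records challenge-response pairs in a secure database.
device = ToyPUF(device_seed=random.getrandbits(64))
crp_table = {}
for _ in range(4):
    c = random.getrandbits(32).to_bytes(4, "big")
    crp_table[c] = device.respond(c)

# Verification in the field: replay a stored challenge and compare responses.
challenge, expected = next(iter(crp_table.items()))
print(device.respond(challenge) == expected)   # True: genuine part
clone = ToyPUF(device_seed=random.getrandbits(64))
print(clone.respond(challenge) == expected)    # almost certainly False: the clone lacks the fingerprint
```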

FIGURE 2 Spectrum of vulnerabilities listing examples of how attacks differ depending on what level is targeted.


FIGURE 3 Concept of separating user-programmable computing functions from a secure processing core which is harder to modify and controls access to power and communications.


Also of primary concern is the globalization of the design chain. A subset of the steps involved in the design and production of integrated circuits includes the following: design, Register-Transfer Level (RTL) conversion, logic gate synthesis, scan chain insertion, clock tree insertion, place-and-route layout, I/O pad design, reticle assembly, foundry fabrication, die slicing, packaging, and delivery. These steps may take place at far-flung companies and locations throughout the world.

One approach to tackling the myriad security issues facing users of programmable devices is to design with strong security measures from the start. A possible concept in this vein, shown in Figure 3, is to divide the computing functions of a programmable microcontroller or FPGA into two parts: one that contains the normal code and memory, used and modified regularly, and a second that contains security and cryptographic information and acts as a monitor for malicious behavior. The secure core would be harder to reprogram and would control the main portion's access to communication channels to the outside world, as well as monitor any information being leaked on side-channels, as described in the next section.
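As a rough illustration of the split shown in Figure 3, the hypothetical sketch below models a secure core that sits between the user-programmable region and the external bus and forwards traffic only while its monitor reports normal behavior. The class and method names are invented for illustration and do not correspond to an actual UMBC design.

```python
from typing import Callable, List


class SecureCore:
    """Gatekeeper between the reprogrammable region and the outside world.
    A monitor callback (e.g., a side-channel classifier) decides whether
    outgoing traffic is allowed. Purely conceptual."""

    def __init__(self, anomaly_monitor: Callable[[], bool]):
        self._anomalous = anomaly_monitor
        self.quarantined: List[bytes] = []

    def transmit(self, frame: bytes, bus_send: Callable[[bytes], None]) -> bool:
        if self._anomalous():
            # Suspected firmware tampering: hold the frame instead of sending it.
            self.quarantined.append(frame)
            return False
        bus_send(frame)
        return True


# Toy usage: the monitor flag flips when the classifier sees unexpected instructions.
suspicious = {"flag": False}
core = SecureCore(anomaly_monitor=lambda: suspicious["flag"])
sent = []
core.transmit(b"\x01\x02", sent.append)   # forwarded
suspicious["flag"] = True
core.transmit(b"\x03\x04", sent.append)   # quarantined
print(sent, core.quarantined)
```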

FIGURE 4 Different examples of side channel emissions that can be exploited to gather information about a chip or sensor.


Side-channel attacks allow an attacker to extract secret information from a target device by monitoring the power supply, electromagnetic radiation, or timing information of the device. Side-channels are conduits of information (inputs/outputs), not intended by designers, that exist in a physical system and increase observability in such a way as to compromise security. Simple Power Analysis (SPA) involves direct interpretation of power supply traces from the operation of interest. Other examples of side-channel attacks include: timing [8], electromagnetic (EM) [9], differential fault [10], scan-based [11], cache-based [12], bus-snooping [13], and acoustic [14]. Side-channel attacks have been proven to reveal the keys of popular encryption ciphers such as the Data Encryption Standard (DES) since Kocher et al. published the seminal paper on this topic in 1999 [7]. For much more background, see their 2011 update to that original work, published in the Journal of Cryptographic Engineering (JCEN) as “Introduction to differential power analysis” [8], which has an extensive list of references.
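To make the SPA idea concrete, the toy simulation below mimics the classic square-and-multiply example: the extra multiply performed only for a key bit of 1 draws more power, so a direct look at the per-bit trace amplitude leaks the exponent. All numbers here are invented for illustration and are unrelated to any measurement described in this article.

```python
import random

random.seed(1)

SECRET_BITS = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical key bits

def simulate_trace(bits):
    """One 'power sample' per key bit: squaring costs ~1.0 units, and the
    extra multiply (done only when the bit is 1) adds ~0.5, plus noise."""
    trace = []
    for b in bits:
        power = 1.0 + (0.5 if b else 0.0) + random.gauss(0, 0.05)
        trace.append(power)
    return trace

trace = simulate_trace(SECRET_BITS)

# SPA: direct interpretation of the trace -- a simple threshold recovers the bits.
recovered = [1 if p > 1.25 else 0 for p in trace]
print(recovered == SECRET_BITS)   # True at this noise level
```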

Ironically, though side-channels have become popular security exploits, they can also be leveraged to increase security. The next section demonstrates how power supply measurements are physically bound to IC operation and can be used to identify parameters of code execution for validation. This insight is further extended by identifying other physical side-channels in a typical cyber-physical control system and introducing monitors to observe and analyze them alongside the physical signals they are bound to. A vehicular system is used as a case study for such a cyber-physical control system in the subsequent section.

In the Eclipse lab at UMBC, a custom board (Figure 5) was designed and fabricated to measure power-consumption data during software execution. The board comprises a control FPGA that facilitates experimentation on a second FPGA, a Xilinx Spartan-3E serving as the Device Under Test (DUT), which instantiates an openMSP430 [15] running at a clock rate of 10 MHz. The control FPGA also functions as the communication device conveying debug and control data to the DUT. Four power-supply pins available on the DUT are used for power supply side-channel analysis by placing a 1 Ω resistor in the supply path of each power pin. The voltage across each resistor, due to the current consumption of the device, is actively probed and amplified on-board and passed through coax cables to four channels of a Tektronix DPO7354C oscilloscope. From there, the data is collected and sent to a PC to be analyzed in custom MATLAB software.
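The shunt-resistor arithmetic behind these measurements is straightforward: with a 1 Ω resistor in the supply path, the voltage drop numerically equals the supply current, and multiplying by the core supply voltage gives instantaneous power. The sketch below shows that conversion for one digitized channel; the amplifier gain and supply voltage are placeholder values, not the board's actual parameters.

```python
SHUNT_OHMS = 1.0        # resistor inserted in each supply pin path
AMP_GAIN = 20.0         # hypothetical on-board amplifier gain
VDD_CORE = 1.2          # hypothetical core supply voltage (volts)

def trace_to_power(scope_samples_v):
    """Convert oscilloscope samples (volts, after amplification) of the
    shunt voltage into instantaneous power drawn by the DUT (watts)."""
    powers = []
    for v_meas in scope_samples_v:
        v_shunt = v_meas / AMP_GAIN          # undo amplifier gain
        i_supply = v_shunt / SHUNT_OHMS      # Ohm's law: I = V / R
        powers.append(i_supply * VDD_CORE)   # P = V_dd * I
    return powers

# Example: 0.4 V on the scope -> 20 mV across the shunt -> 20 mA -> 24 mW.
print(trace_to_power([0.4, 0.2]))
```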

FIGURE 5 (a) Setup of FPGA side-channel power measuring device. (b) System level block diagram showing the relationship between the device under test and the control FPGA.


The openMSP430 instruction set consists of instructions that take one to six clock cycles each [16]. A training set of power profiles was formed by averaging over 20,000 instances of each instruction. Figure 6 shows the collection of power traces for all the 2-cycle and 4-cycle instructions. One can see differences between the power traces that can potentially be used to identify which instructions were executed, given a new power measurement.
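A minimal sketch of how such instruction templates can be formed: repeated, aligned captures of each instruction are averaged point-wise, and the averages serve as reference templates. The array shapes, sample rate, and NumPy workflow below are assumptions for illustration; the lab's actual processing uses custom MATLAB software.

```python
import numpy as np

SAMPLES_PER_CYCLE = 50   # hypothetical scope samples per 10 MHz clock cycle

def build_templates(raw_traces):
    """raw_traces maps an instruction mnemonic to an array of repeated
    captures, shape (num_repeats, cycles * SAMPLES_PER_CYCLE). The template
    is the point-wise mean, which suppresses measurement noise."""
    return {instr: captures.mean(axis=0) for instr, captures in raw_traces.items()}

# Synthetic stand-in for 20,000 captures of two 2-cycle instructions.
rng = np.random.default_rng(0)
shape = (20_000, 2 * SAMPLES_PER_CYCLE)
raw = {
    "MOV @Rs,Rd": 1.0 + 0.3 * rng.standard_normal(shape),
    "SUB #imm,Rd": 1.2 + 0.3 * rng.standard_normal(shape),
}
templates = build_templates(raw)
print({k: round(v.mean(), 3) for k, v in templates.items()})
```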

FIGURE 6 Averaged power traces collected for the 2-cycle and 4-cycle MSP430 instruction groups.


Each collection of same-clock-cycle instructions was further grouped based on hardware utilization, addressing mode (e.g., memory, register), computational operation, and status register updates. For example, the 2-clock-cycle instructions were broken into three groups: the first involved operations where the source was memory and the destination was a register; the second involved a constant (immediate) operand going to a register while performing a subtraction operation (using the two's-complement hardware); and the third consisted of immediate-to-register operations without subtraction.

As a first step in matching instructions to a measured power trace, the intent is to find the pattern of clock-cycle groups that best approximates the sample. As shown in Figure 7, the sampled waveform could be made up of {4,3,2}, {2,6,1}, or {2,3,4} cycle groups. In general, finding the best match is a problem of NP time complexity, so computing an exact solution is impractical. As an alternative, the UMBC group developed a dynamic programming scheme that uses nearest-neighbor comparisons against the templates to find candidate sequences that best match the input trace [17]. Once the number of clock cycles is known for each segment, the segments can be compared against the templates to find the group of instructions that best matches the trace. In laboratory tests using the FPGA setup, the group demonstrated 72–100% accuracy in matching the correct instruction group. For the three groups formed from the 2-clock-cycle instructions, classification accuracies of 100%, 95.3%, and 79.4% were achieved [18].
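A simplified sketch of the segmentation idea in Figure 7: dynamic programming decides, at each cycle boundary, which template length ends there so that the total nearest-neighbor distance to the templates is minimized, avoiding an exhaustive search over all cycle patterns. The cost metric and template format are simplifying assumptions; the published FISCAL approach [18] is considerably more involved.

```python
import numpy as np

SAMPLES_PER_CYCLE = 50

def segment_trace(trace, templates):
    """trace: 1-D array whose length is a whole number of cycles.
    templates: dict mapping cycle-count -> list of template arrays.
    Returns the lowest-cost sequence of cycle counts covering the trace."""
    n_cycles = len(trace) // SAMPLES_PER_CYCLE
    INF = float("inf")
    best = [INF] * (n_cycles + 1)   # best[i]: min cost to explain first i cycles
    best[0] = 0.0
    choice = [None] * (n_cycles + 1)
    for i in range(1, n_cycles + 1):
        for length, tmpl_list in templates.items():
            if length > i or best[i - length] == INF:
                continue
            seg = trace[(i - length) * SAMPLES_PER_CYCLE : i * SAMPLES_PER_CYCLE]
            # Nearest-neighbor cost: distance to the closest template of this length.
            cost = min(np.linalg.norm(seg - t) for t in tmpl_list)
            if best[i - length] + cost < best[i]:
                best[i] = best[i - length] + cost
                choice[i] = length
    # Backtrack the chosen cycle-count pattern.
    pattern, i = [], n_cycles
    while i > 0:
        pattern.append(choice[i])
        i -= choice[i]
    return pattern[::-1]

# Toy templates (2- and 4-cycle) and a trace built from a {4, 2, 2} pattern.
rng = np.random.default_rng(3)
t2 = np.ones(2 * SAMPLES_PER_CYCLE)
t4 = 1.3 * np.ones(4 * SAMPLES_PER_CYCLE)
templates = {2: [t2], 4: [t4]}
trace = np.concatenate([t4, t2, t2]) + 0.05 * rng.standard_normal(8 * SAMPLES_PER_CYCLE)
print(segment_trace(trace, templates))   # expected: [4, 2, 2]
```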

One application area that has generated a great deal of public interest is the sensor network in a typical automobile, generally connected by the aging controller area network (CAN) bus. In 2014, Senator Ed Markey published a report that described the current threats facing modern vehicles and some responses from manufacturers about the steps they were taking to prevent malicious attacks [19]. Using the security monitor concept, one could potentially add components between vehicle measuring units and the bus to control their access if malicious or abnormal behavior is identified. A UMBC project that involved installing trusted physical sensors in a vehicle is described below.

This initiative [20] installed additional trusted sensors in an automobile to provide independent signals against which CAN bus data can be compared in order to detect and prevent anomalies or attacks. Textile capacitive sensors [21] were installed near the pedals of test vehicles; the proximity of the driver's leg to a sensor provides a measurement of the physical movement corresponding to depressing the brake or accelerator pedal. An inertial sensor placed on the steering wheel measured the steering inputs. The data from these sensors was compared with similar data on the CAN network to corroborate the authenticity of the packets on the CAN bus. If the CAN network is compromised, these additional trusted sensors can act as an out-of-band detector of anomalies. The researchers envision being able to fuse, in near real time, information from the CAN bus and the external trusted sensors to alert the driver to CAN inconsistencies, and possibly to command operation in a degraded mode that maintains safe driving while preventing further malicious behavior. In any case, the inclusion of these additional sensors will make it more difficult for attackers to inject false data into the vehicle.
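The cross-checking concept can be sketched as a simple consistency test: the trusted capacitive pedal sensor and the CAN-reported brake signal should rise and fall together, so a sustained disagreement over a sliding window flags possible injection. The thresholds, scaling, and window length below are illustrative assumptions, not parameters of the UMBC prototype.

```python
from collections import deque

class BrakeCrossChecker:
    """Compares normalized readings (0..1) from a trusted capacitive pedal
    sensor against the brake position reported on the CAN bus. A sustained
    disagreement over a sliding window raises an anomaly flag."""

    def __init__(self, window: int = 10, tolerance: float = 0.25, max_bad: int = 6):
        self.history = deque(maxlen=window)
        self.tolerance = tolerance
        self.max_bad = max_bad

    def update(self, trusted_pedal: float, can_brake: float) -> bool:
        self.history.append(abs(trusted_pedal - can_brake) > self.tolerance)
        return sum(self.history) >= self.max_bad   # True => suspected spoofing

checker = BrakeCrossChecker()
# Normal driving: both signals agree.
for _ in range(10):
    assert not checker.update(trusted_pedal=0.1, can_brake=0.15)
# Injected CAN frames claim hard braking while the driver's leg has not moved.
alarms = [checker.update(trusted_pedal=0.1, can_brake=0.9) for _ in range(10)]
print(alarms)   # flips to True once the disagreement persists
```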

Figure 8 illustrates a potential configuration of an add-on heterogeneous trusted sensor system in one portion of the vehicular system. The chain links represent physical and cyber interfaces that have bounded relationships. Green arrows represent identified side-channels being analyzed by security monitors. Yellow arrows represent the aggregate information sent into the security unit. The security unit would enforce limitations on components, for example through the secure core processor described above, in the event of a suspected attack. By exploiting the physical relationships between pressing a brake pedal and the operator's leg position, and between the power consumption of a sensor and the instructions it is running, the group proposes new indicators that can be used to increase resilience to cyberattacks.

FIGURE 7 Dynamic programming example. Given an arbitrary waveform, one must first determine the correct pattern of clock-cycle groups before matching individual instructions.


This concept describes an example for a small section of a typical vehicle system. The group's future research seeks to expand this general approach of relating sensors to the physical properties they measure, or actuators to the causes and effects of their actions. Other potential examples include a microphone to determine whether a fan is spinning or an inertial sensor to measure the physical deflection of a steering wheel. These inexpensive add-on trusted sensors will communicate with higher-level controllers using out-of-band communication paths, and because they are unlike the nominal sensors, it would take much more effort to compromise both as part of a malicious attack. The UMBC Eclipse research cluster (https://eclipse.umbc.edu/) is currently collaborating with several leading academic and government institutions, but continues to seek new potential partners.

FIGURE 8 Diagram of trusted add-on sensors in a car application. Exploiting the physical relations and using heterogeneous sensors will increase cyber resiliency.


This work was supported in part by the U.S. Office of Naval Research under Award N00014-15-1-2179.

Copyright © 2017 by ASME

References

[1] Cárdenas, Alvaro A., Saurabh Amin, and Shankar Sastry. “Research challenges for the security of control systems.” Proceedings of the 3rd USENIX Conference on Hot Topics in Security, USENIX Association, 2008.
[2] NIST Industrial Control System (ICS) Cyber Security presentation, “Maroochy Water Services Case Study,” Aug. 2008, http://csrc.nist.gov/groups/SMA/fisma/ics/documents/Maroochy-Water-Services-Case-Study_briefing.pdf
[3] U.S. Senate Armed Services Committee Report 112-167, “Inquiry into Counterfeit Electronic Parts in the Department of Defense Supply Chain,” 2012, http://www.armed-services.senate.gov/imo/media/doc/Counterfeit-Electronic-Parts.pdf
[4] Bernstein, Kerry. DARPA Broad Agency Announcement DARPA-BAA-14-16, “Supply Chain Hardware Integrity for Electronics Defense (SHIELD),” 2014.
[5] Koushanfar, Farinaz, and Ramesh Karri. “Can the SHIELD protect our integrated circuits?” 2014 IEEE 57th International Midwest Symposium on Circuits and Systems (MWSCAS), IEEE, 2014, pp. 350–353.
[6] Hussain, Siam U., Sudha Yellapantula, Mehrdad Majzoobi, and Farinaz Koushanfar. “BIST-PUF: Online, hardware-based evaluation of physically unclonable circuit identifiers.” 2014 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), IEEE, 2014, pp. 162–169.
[7] Kocher, Paul, Joshua Jaffe, and Benjamin Jun. “Differential power analysis.” Advances in Cryptology – CRYPTO ’99, Springer Berlin Heidelberg, 1999.
[8] Kocher, Paul, et al. “Introduction to differential power analysis.” Journal of Cryptographic Engineering 1(1), 2011, pp. 5–27, http://link.springer.com/content/pdf/10.1007/s13389-011-0006-y.pdf
[9] Quisquater, J.-J., and Samyde, D. “Electromagnetic analysis (EMA): Measures and countermeasures for smart cards.” Smart Card Programming and Security, I. Attali and T. Jensen, Eds., vol. 2140 of Lecture Notes in Computer Science, Springer Berlin Heidelberg, 2001, pp. 200–210.
[10] Biham, E., and Shamir, A. “Differential fault analysis of secret key cryptosystems.” Advances in Cryptology – CRYPTO ’97, B. S. Kaliski, Ed., vol. 1294 of Lecture Notes in Computer Science, Springer Berlin Heidelberg, 1997, pp. 513–525.
[11] Yang, B., Wu, K., and Karri, R. “Scan based side channel attack on dedicated hardware implementations of Data Encryption Standard.” Proceedings of the IEEE International Test Conference (ITC), Oct. 2004, pp. 339–344.
[12] Page, D. “Theoretical use of cache memory as a cryptanalytic side-channel.” Technical Report CSTR-02-003, Department of Computer Science, University of Bristol, 2002.
[13] Kuhn, M. “Cipher instruction search attack on the bus-encryption security microcontroller DS5002FP.” IEEE Transactions on Computers 47(10), Oct. 1998, pp. 1153–1157.
[14] Asonov, D., and Agrawal, R. “Keyboard acoustic emanations.” 2004 IEEE Symposium on Security and Privacy (S&P 2004), Berkeley, CA, USA, May 9–12, 2004, pp. 3–11.
[15] “MSP430x1xx Family User's Guide,” Rev. F, Texas Instruments, 2006, http://www.ti.com/lit/ug/slau049f/slau049f.pdf
[16] Girard, O. “openMSP430,” http://opencores.org/project,openmsp430, 2016 (accessed March 1, 2016).
[17] Altman, N. S. “An introduction to kernel and nearest-neighbor nonparametric regression.” The American Statistician 46(3), August 1992, pp. 175–185.
[18] Krishnankutty, D., Robucci, R., Patel, C., and Banerjee, N. “FISCAL: Firmware Identification using Side-Channel Power Analysis.” VLSI Test Symposium, 2017 (in press).
[19] Report by staff of U.S. Senator Edward Markey, “Tracking & Hacking: Security & Privacy Gaps Put American Drivers at Risk,” 2014, https://www.markey.senate.gov/imo/media/doc/2015-02-06_MarkeyReport-Tracking_Hacking_CarSecurity%202.pdf
[20] Kulandaivel, S., Schmandt, J., Fertig, M., and Banerjee, N. Presentation, “Detection and Mitigation of Anomalous Behavior in Embedded Automotive Networks,” 2016, http://ur.umbc.edu/files/2016/06/kulandaivelSekarSm.pdf
[21] Singh, G., Chen, T. A., Robucci, R., Patel, C., and Banerjee, N. “distratto: Impaired Driving Detection Using Textile Sensors.” IEEE Sensors Journal 16(8), April 15, 2016, pp. 2666–2673.
