
US20090240951A1 - System security manager - Google Patents

System security manager

Info

Publication number
US20090240951A1
US20090240951A1 (application US 12/366,600)
Authority
US
United States
Prior art keywords
fplc
recited
image
cryptographic processing
cryptographic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/366,600
Inventor
John R. Owens
John C. Andolina
Stuart Shanken
Richard L. Quintana
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Viasat Inc
Original Assignee
Viasat Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Viasat Inc
Priority to US 12/366,600
Assigned to VIASAT, INC. (Assignors: ANDOLINA, JOHN C.; OWENS, JOHN R.; SHANKEN, STUART; QUINTANA, RICHARD L.)
Publication of US20090240951A1
Security agreement assigned to UNION BANK, N.A. (Assignor: VIASAT, INC.)
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50: Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/57: Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F 21/575: Secure boot
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/70: Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F 21/71: Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
    • G06F 21/76: Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information in application-specific integrated circuits [ASIC] or field-programmable devices, e.g. field-programmable gate arrays [FPGA] or programmable logic devices [PLD]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00: Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21: Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2143: Clearing memory, e.g. to prevent the data from being stolen

Definitions

  • This disclosure relates in general to field-programmable logic chips or circuits (FPLCs) and, but not by way of limitation, to FPLCs used in traffic processing such as cryptographic processing.
  • Programmable circuitry is common in logic design, but achieving the level of security and high assurance desired by governments, individuals, and certain businesses has been difficult. Programmability is seen as a risk to achieving security and high assurance. This is especially true when programmability is utilized in the field. One could imagine the programmability feature being used to cause the FPLC to operate in a mode that would not have the required security.
  • Cryptographic circuitry can fail or be compromised. Where such a circuit enters an error mode, there is no recovery. Failure can result in insecure functioning that is not desirable.
  • the cryptographic circuit can erase keys to prevent further activity. Even without keys, the cryptographic circuit can perform in undesirable modes when malfunctioning. With programmability, the risk of these malfunctions is greater.
  • FPLCs have many advantages over fixed circuitry, but cost is generally not one of the advantages.
  • Various images and soft cores are loaded into FPLCs.
  • the size of the FPLC is chosen to accommodate all the images needed for a given design. When smaller or fewer FPLCs are possible, the cost of producing a unit decreases.
  • a method for securing a field-programmable logic chip or circuit is disclosed.
  • Information is cryptographically processed within the FPLC.
  • An error condition is detected outside of the FPLC and the error condition is communicated to the FPLC to disrupt an image(s) within the FPLC.
  • at least a portion of a key can be erased such that cryptographic processing is curtailed or eliminated.
  • the present disclosure provides a method for providing system security for a FPLC.
  • the method includes cryptographically processing information within the FPLC and detecting an error condition.
  • the detecting operation is performed external to the FPLC.
  • the method also includes conveying into the FPLC the error condition.
  • the method also includes disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected, where the image cannot perform cryptographic processing after disruption even if keys are present.
  • the present disclosure provides a cryptographic processing system for providing system security for a FPLC.
  • the cryptographic processing system includes a cryptographic soft core at least partially within the FPLC that cryptographically processes information.
  • the cryptographic processing system also includes a system security manager that detects an error condition. The detection is performed external to the FPLC, the cryptographic soft core in the FPLC is disrupted when the error condition is detected and the cryptographic soft core cannot perform cryptographic processing after disruption even if keys are present.
  • the present disclosure provides a cryptographic processing system for providing system security for a FPLC.
  • the cryptographic processing system includes a means for cryptographically processing information within the FPLC.
  • the system also includes a means for detecting an error condition. The detection is performed external to the FPLC.
  • the system also includes a means for disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected. The image cannot perform cryptographic processing after disruption even if keys are present.
  • FIGS. 1A, 1B and 1C depict block diagrams of embodiments of a cryptographic processor system;
  • FIG. 2A depicts a block diagram of an embodiment of a system security manager (SSM);
  • FIG. 2B depicts a block diagram of an embodiment of a fail-safe SSM
  • FIG. 3 illustrates a flowchart of an embodiment of a process for booting a cryptographic processor system
  • FIGS. 4A, 4B, 4C and 4D depict block diagrams of embodiments of a traffic processing system;
  • FIGS. 5A, 5B, 5C, 5D, 5E, 5F, 5G, and 5H depict diagrams of embodiments of a layout of a field-programmable logic chip (FPLC) implementing a traffic processor;
  • FIGS. 6A and 6B depict diagrams of embodiments of a state machine used to control the traffic processor.
  • FIG. 7 illustrates a flowchart of an embodiment of a process for cryptographically processing information in a two state configuration.
  • Referring to FIG. 1A, a block diagram of an embodiment of a cryptographic processor system 100-1 is shown.
  • The FPLCs (field-programmable logic chips or circuits) can be realized as field-programmable gate arrays (FPGAs), programmable logic devices (PLDs), or complex PLDs (CPLDs).
  • Programmability in the field includes programmability while manufacturing a system including the FPLC or programmability while the system is deployed with an end user.
  • a FPLC is a circuit chip or die in its own package or chips or dice in a multi-chip module.
  • a programmable logic image is a soft core of functionality that can be programmed into a FPLC or otherwise implemented.
  • the PLI could include a general-purpose processor, a state machine, an application-specific processor, a cryptofunction, and/or configuration information and parameters.
  • a number of PLIs may be in a single FPLC or a single PLI may be spread out among a number of FPLCs.
  • the blocks shown in the figures can be combined or split in various embodiments.
  • a number of PLIs are used to process traffic or more specifically, cryptographically process traffic.
  • Plain text information is received by the interface PLI 140 for encryption processing, and cipher text information is output by the interface PLI 140 .
  • cipher text information is received by the interface PLI 140 for decryption processing, and plain text information is output by the interface PLI 140 .
  • the interface PLI 140 can pass information without cryptographic processing in some cases.
  • the three PLIs 104 , 124 , 140 in this embodiment could be implemented in one or two FPLCs with the first holding the initiator PLI, the second holding the interface PLI and the crypto PLI divided between the two FPLCs.
  • a crypto PLI 124 performs cryptographic processing in a traffic processing state. These are just representative blocks for performing cryptographic processing and could be combined or separated in various embodiments. If loaded into the same FPLC, PLIs can be isolated from each other with a buffer of unused cells and controlled interfaces between the isolated areas. Signals from one PLI can be kept from routing outside the isolated area except where a deliberate port is configured to pass information between PLIs. In this way, isolation can be achieved in the same device unless interaction through a port is desired.
  • Soft cores for the various PLIs in their various versions are held in a storage flash 108 . These soft cores are sometimes referred to as images. Additionally, key fragments or layers can be held in the storage flash 108 .
  • the storage flash 108 can additionally hold software to boot and run any processor of the cryptographic processor system 100. Any type of flash memory or other non-volatile memory can be used for the storage flash 108.
  • This embodiment also includes volatile memory 132 .
  • a processing core within the initiator PLI 104 loads software from the storage flash 108 and uses the volatile memory 132 for program operation and variable storage.
  • RAM, SRAM or any type of volatile memory could be used for the volatile memory 132 .
  • Other embodiments could use non-volatile memory, for example, MRAM for the volatile memory 132 .
  • a battery or other power source (not shown) is used to allow a battery-backed memory 106 to retain its contents even when main power is interrupted or lost.
  • SRAM or DRAM can be used for the battery-backed memory.
  • the battery-backed memory 106 may also hold key layers. In one embodiment, some key layers are stored in the storage flash 108 while others are stored on the battery-backed memory 106 . Further, key layers can be held on a token that is removably coupled to the cryptographic processor system 100 through a token interface. Any type of non-volatile memory can be used for the storage flash 108 , the battery-backed memory 106 or token.
  • the program load circuit 112 loads soft cores for the PLIs into one or more FPLCs.
  • the program load circuit 112 could be implemented with a CPLD, for example.
  • the various images or soft cores in the storage flash 108 are loaded into the programmable logic in a particular sequence that can be controlled by the program load circuit 112 and/or the loaded PLIs. Certain PLIs can be loaded into the programmable logic and later removed such that other PLIs can recover some of the programmable logic when the removed PLIs are not needed.
  • An initiator PLI 104 may assist in this process and perform other configuration once the soft core of the initiator PLI 104 is loaded and functioning. Certain configuration actions such as loading keys, built-in test and other housekeeping functions can be performed by the initiator PLI 104 . In some embodiments, the soft core of the initiator PLI 104 may be removed if not needed and reloaded when it is needed. Memory can be used to pass information from the initiator PLI 104 to other PLIs to use when the initiator PLI is not loaded.
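The load-then-reclaim sequence described above can be sketched as follows; this is a minimal illustration, and all class, variable and image names here are invented for the sketch, not taken from the patent.

```python
# Hypothetical sketch: a loader that programs soft-core images into an FPLC
# in sequence, and can later unload one (e.g. the initiator) to reclaim its
# logic, leaving handoff parameters in memory for the remaining PLIs.
class ProgramLoadCircuit:
    def __init__(self, storage_flash):
        self.storage_flash = storage_flash  # image name -> bitstream bytes
        self.loaded = {}                    # images currently in the FPLC
        self.shared_memory = {}             # parameters left for other PLIs

    def load(self, name):
        self.loaded[name] = self.storage_flash[name]

    def unload(self, name, handoff=None):
        # Before removing e.g. the initiator PLI, persist any parameters
        # that other PLIs will need while it is not loaded.
        if handoff:
            self.shared_memory.update(handoff)
        self.loaded.pop(name, None)

flash = {"initiator": b"\x01", "crypto": b"\x02", "interface": b"\x03"}
plc = ProgramLoadCircuit(flash)
plc.load("initiator")   # initiator loads first, performs housekeeping
plc.load("crypto")
plc.load("interface")
plc.unload("initiator", handoff={"keys_loaded": True})  # reclaim its logic
```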
  • system security manager (SSM) 116 monitors for errors and alarms before taking remedial action on the PLIs, FPLCs and/or key layers. Redundancy can be used by the SSM 116 to operate in a failsafe, trusted and/or high-assurance manner.
  • FIG. 1B shows a block diagram of another embodiment of the cryptographic processor system 100 - 2 that uses a fail-safe SSM 118 instead of a SSM 116 .
  • the fail-safe SSM 118 includes redundancy and/or other high-assurance circuits as further described below.
  • This embodiment also includes a processor chip 128 instead of or in addition to a processing soft core within the initiator PLI 104.
  • the processor chip 128 is a hardware processor separate from the FPLC holding the initiator PLI 104 .
  • Volatile memory 132 is used by the processor chip 128 for program operation and variable storage.
  • Referring to FIG. 1C, a block diagram of yet another embodiment of a cryptographic processor system 100-3 is shown.
  • This embodiment includes a security manager PLI 136 that is embedded in a FPLC. Some or all of the FPLCs used in the cryptographic processor system 100 could have their own security manager PLI 136.
  • the security manager PLI could be used in addition to a SSM 116 or a fail-safe SSM 118 in various embodiments.
  • the security manager PLI 136 is further described below.
  • Referring to FIG. 3, a flow chart of an embodiment of a process 300 for booting a cryptographic processor system 100 is shown.
  • the depicted portion of the process begins in block 304 where the storage flash 108 is loaded with one or more default images.
  • Some FPLCs have a decryption algorithm built into the chip to be programmed that uses a fixed key to allow decryption of images loaded into that chip.
  • Xilinx™ and Altera™ FPLCs, for example, can provide on-chip advanced encryption standard (AES) decryption of images using a predetermined key that is fixed and battery-backed on the chip.
  • Block 304 would include encrypting the default image(s) with the appropriate key prior to storage in the storage flash 108 .
  • the default images provide just enough logic to get the cryptographic processor system 100 running in a configuration state, but not enough to allow data throughput in this embodiment.
  • the initiator PLI 104 can be a default image in some embodiments.
  • the default images support built-in-test to allow checking that the logic circuitry at least has some functionality and that the circuit card was assembled properly. Other embodiments could do certain unclassified traffic processing with the default images.
  • a loaded default image allows running and updating of the operational software.
  • the default images include another decryption algorithm that is unclassified. This unclassified decryption algorithm is a soft core that can be loaded into a FPLC to allow decryption and loading of additional logic in the same chip or another chip.
  • protected images are loaded into the storage flash 108 . These may be optionally encrypted to allow decoding with the on-chip decryption algorithm.
  • the protected images are encrypted to allow decryption with the unclassified decryption algorithm and an appropriate key.
  • the protected images allow full cryptographic processing when properly loaded and enabled with the appropriate key(s).
  • a particular portion of a FPLC or a whole programmable FPLC could start out with a default image that is later replaced wholly or in part with another protected or unprotected image. Other embodiments could keep some or all of the default image functioning alongside protected images.
  • Operation of the cryptographic processor system 100 is begun in block 316.
  • Prior blocks 304 , 308 and 312 could be performed at the factory in one embodiment.
  • Block 312 is likely to be done at least partially in the field as layers of the multi-layered key may change over time or be erased.
  • Booting may be begun by application of power to the cryptographic processor system 100 or by a reset operation or other remedial action.
  • One or more default images are retrieved from the storage flash 108 and loaded into programmable logic by the program load circuit 112 in block 320.
  • the initiator PLI 104 could be loaded as a default image.
  • the default image(s) may be encrypted and a previously-loaded default image, such as the initiator PLI 104 , could automatically decrypt the default image before loading into the programmable logic.
  • Other embodiments may use the cryptographic function built into some FPLCs as explained above. Yet other embodiments could use the built-in cryptographic function and a soft core cryptographic function to utilize double decryption.
  • Embodiments may have various blocks implemented in one or more PLIs to separate different functions within a given FPLC.
  • Once the initiator PLI 104 has the default image loaded in unencrypted form, a general-purpose processor is available as a soft core that was part of the default image. Other embodiments could use a hard core for the processor used by the initiator PLI and external to any FPLC. Memory is available for processing and retrieval of operational software that is booted in block 324. Once the default software is loaded, there is enough intelligence to accept new software and/or keys, but no cryptographic processing of traffic can be done in this embodiment of the default software. In this embodiment, the initiator PLI 104 has the unclassified decryption algorithm as a soft core or available in software. The PLIs operate with default images in block 324.
  • the initiator PLI retrieves the remainder of the default images in block 326 . Those default images are loaded using optional decryption before built-in-test is performed on each of the other PLIs. During the loading process, verification of checksums, CRCs or hashes can be performed to confirm the default images were loaded correctly. The initiator PLI 104 can reload the default images where there is an error in the checksum, CRC or hash.
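The verify-and-reload behavior above can be illustrated with a small sketch; SHA-256 stands in here for whichever checksum, CRC or hash a given implementation uses, and the function names are invented for the example.

```python
# Illustrative load-and-verify loop: hash the image as loaded, compare
# against the predetermined value, and reload on mismatch, as the
# initiator PLI would.
import hashlib

def load_image_verified(read_image, expected_sha256, max_attempts=3):
    for _ in range(max_attempts):
        image = read_image()
        if hashlib.sha256(image).hexdigest() == expected_sha256:
            return image
        # mismatch: fall through and reload the image
    raise IOError("image failed verification after reloads")

good = b"default-image"
expected = hashlib.sha256(good).hexdigest()
# simulate one corrupted read followed by a good one
reads = iter([b"corrupt", good])
image = load_image_verified(lambda: next(reads), expected)
```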
  • a determination in block 328 analyzes whether there are multi-layer keys present to continue booting the cryptographic processor system 100 .
  • the multi-layer key is reconstituted by retrieving various layers or portions from different locations. For example, the storage flash 108, the token and/or the battery-backed memory 106 may each have a portion of the multi-level key. Where one or more layers of the multi-layer key are missing, processing loops back to block 312 to wait for loading of the missing key layer(s). If the multi-layer key is present, processing goes from block 328 to 332, but goes from block 328 to block 312 if the multi-layer key is missing.
  • the multi-layered key(s) are loaded into the cryptographic processor system 100 in block 312 .
  • the multi-layered key has multiple portions that are all needed by the unclassified decryption algorithm to decrypt the protected images. Layers of the multi-layered key(s) can be erased if error conditions are found. Without all of the layers, the multi-layered key will not be usable.
  • a battery-backed memory 106 and/or flash may be used to store the various layers of the multi-layered key. This embodiment stores various layers or values of the keys in the storage flash 108 , the battery-backed memory 106 and/or a token coupled to a token interface.
  • the layers may additionally be encrypted, for example, the layer retrieved from the storage flash 108 is encrypted in one embodiment.
  • Some locations may store multiple layers or portions of the multi-layer key, for example, the battery-backed memory 106 could have two portions that can be individually erased or scrambled to destroy the multi-layer key. The condition for destroying each layer of the key could be different to protect against different threats.
  • the initiator PLI 104 loads a soft core version of the unclassified decryption algorithm.
  • the unclassified decryption algorithm could be AES, triple-DES or any other appropriate algorithm.
  • the multi-layer key is constituted by retrieving layers from two or more locations.
  • the protected soft cores are retrieved from the storage flash 108 and decrypted by the initiator PLI 104 in block 336 .
  • Some embodiments may also use the AES encryption built into the FPLC to utilize double decryption. That second order of protection of the AES encryption would involve decryption again as the protected image is loaded into a particular FPLC.
  • the initiator PLI 104 may calculate a checksum or hash as a soft core is loaded. The initiator PLI 104 can compare the checksum or hash against a predetermined value to confirm the soft core was loaded correctly. The initiator PLI 104 can reload the protected image where there is an error in the checksum or hash.
  • the PLIs loaded with their default or protected images can be tested with various built-in tests (BIT) in block 338.
  • known-answer verifications could be performed, where a known input is fed to one or more PLIs to determine if the known output is produced.
  • Other possible testing of the PLIs can be performed such as scan chain testing, check words, check sums, boundary scan, integrity test, and other tests.
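A known-answer verification can be sketched as follows; SHA-256 stands in for whatever cryptofunction the PLI under test implements, and the vector names are invented for the example.

```python
# Illustrative known-answer test: feed a fixed input through the function
# under test and compare against the precomputed expected output.
import hashlib

def known_answer_test(func, known_input, known_output):
    return func(known_input) == known_output

KNOWN_INPUT = b"known answer test vector"
KNOWN_OUTPUT = hashlib.sha256(KNOWN_INPUT).digest()

# A healthy PLI reproduces the known output; a faulty one would not.
passed = known_answer_test(lambda m: hashlib.sha256(m).digest(),
                           KNOWN_INPUT, KNOWN_OUTPUT)
```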
  • the cryptographic processor system 100 is in a trusted state. Beyond testing, the trusted state is implemented with redundancy in this embodiment.
  • Monitoring is performed in block 344 for error conditions after the known answer and other verifications. Where there are conditions that could indicate a security concern, one or more layers of the multi-layer key are deleted in block 348 before looping back to block 312 to wait for a new key layer load, which may be manually or automatically performed. From block 312, the booting process could loop back to either block 316 or block 338 depending on the severity of the error. With other errors detected in block 344, processing goes from block 344 to block 338 to test the PLIs again without removal of one or more key layers. In the absence of errors detected in block 344, the cryptographic processor system 100 is available to encrypt and decrypt traffic in a fully operational mode in block 340. Testing continues in block 344 during operation either periodically, upon certain events or when errors are suspected.
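The severity-based branching in block 344 can be sketched as a small decision function; the error categories and return labels below are invented for the sketch, keyed to the block numbers in the flowchart.

```python
# Hedged sketch of the monitoring decision: a security-grade error erases
# one layer of the multi-layer key and returns to key loading (block 312);
# a lesser error only re-runs built-in test (block 338); no error keeps
# the system operational (block 340).
def handle_monitor_result(error, key_layers):
    if error == "security":
        if key_layers:
            key_layers.pop()        # delete one layer of the multi-layer key
        return "block_312_wait_for_key"
    if error == "other":
        return "block_338_retest"
    return "block_340_operational"

layers = ["flash_layer", "token_layer"]
state_secure = handle_monitor_result("security", layers)
state_retest = handle_monitor_result("other", layers)
state_ok     = handle_monitor_result(None, layers)
```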
  • FPGAs and/or FPLCs may have a security manager PLI 136 for programming into the FPLC.
  • security manager PLI 136 is a soft core in this embodiment. Periodically, the security manager PLI 136 can check the other soft cores loaded into the FPLC to confirm they match what was originally loaded. Changes in the loaded programming could be detected in this manner. When an error condition is detected, the security monitor can erase the programming from within the FPGA or FPLC such that it is returned to an inoperable state.
  • the programmed-in security manager PLI 136 is not redundant. Additionally, these approaches presume that the security manager PLI 136 is operating properly despite other problems detected within the FPLC.
  • Referring to FIG. 2A, a SSM 116 is shown that has circuitry external to the FPLCs housing the PLIs.
  • the SSM 116 can activate the internal security monitoring circuit to erase the programming in a FPLC and/or may just overwrite the programming, reset the logic or otherwise prevent further operation of the FPLC.
  • the SSM 116 can work in conjunction with the security monitor or replace the function of the security monitor.
  • the SSM 116 is outside the FPLC used for other PLIs, but could be implemented in an ASIC, FPGA, CPLD, or PLD.
  • a CPLD is used to implement the SSM 116 and is not field reprogrammable.
  • Other soft cores or PLIs could be included in the FPLC used by the SSM 116 in other embodiments.
  • the SSM 116 can erase/overwrite/reset PLIs and/or FPLCs, keys, and/or key portions or layers.
  • the system security monitor 116 may receive an indication that a particular FPGA security monitor found a single point failure and erased the FPGA.
  • the system security monitor 116 could respond by writing an erasing or initialization program into the FPGA before reprogramming it once again. Certain conditions only result in erasing and/or reprogramming a portion of a PLI, a whole PLI, multiple PLIs, a FPLC, or multiple FPLCs.
  • This embodiment has a number of security functions that are activated based upon how the inputs are interpreted by a threat analysis circuit 216 .
  • the threat analysis circuit 216 can activate an erase key circuit 204, a PLI wipe circuit 212 and/or a re-fill and test PLI (RATP) circuit 208.
  • the erase key circuit 204 could erase, overwrite and/or otherwise disable a key. A layer of a multi-layer key may be erased or overwritten to effectively disable use of the multi-layer key even though other layers of the multi-layer key are still available. Where keys should be disabled, for example, the erase key circuit 204 can erase layers in the battery-backed memory 106 , the token and/or the storage flash 108 .
  • the testing of the PLI in the RATP circuit 208 can be built-in-test, security monitor tests, known answer verifications or the like. Re-filling of the PLI can be of the original image, a default image or a null image. Although this embodiment of the RATP circuit 208 and PLI wipe circuit 212 operate at the PLI level, other embodiments could also optionally operate on one or more FPLCs.
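The condition-dependent response selection can be sketched as a dispatch table; the condition names and the mapping from condition to response below are invented for the sketch, since the patent leaves the exact mapping open.

```python
# Illustrative dispatch for the threat analysis circuit 216: the responses
# chosen (erase keys, wipe a PLI, or re-fill and test) depend on how the
# input condition is classified.
def threat_response(condition):
    responses = {
        "tamper_detected":   ["erase_key", "pli_wipe"],
        "single_bit_upset":  ["refill_and_test"],
        "checksum_mismatch": ["pli_wipe", "refill_and_test"],
    }
    return responses.get(condition, [])

actions = threat_response("tamper_detected")
```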
  • Referring to FIG. 2B, an embodiment of a fail-safe SSM 118 is shown that uses redundant SSMs 116.
  • the embodiment of the SSM 116 in FIG. 2A does not operate in a failsafe mode that is trusted according to some criteria.
  • This embodiment duplicates all the circuitry in the SSM 116 such that both copies must perform the same way; any divergence causes an alarm condition. In this way, any failure of one of the SSMs 116 would cause an erasure of the various PLIs, FPLCs, key layers, and/or keys.
  • a consistency check circuit 216 compares inputs and outputs of the parallel SSMs 116 to assure there are matches. If one SSM 116 goes haywire, the PLI containing the fail-safe SSM 118 can be erased after possibly erasing one or more keys or key layers.
  • Referring to FIG. 4A, a block diagram of an embodiment of a traffic processing system 400-1 is shown.
  • the traffic processing could be cryptographic or other processing on data.
  • This embodiment processes data in a manner that can tolerate delays associated with switching into a configuration state when necessary to perform configuration and any key management before switching back to a traffic processing state to operate upon more data.
  • the traffic processing system 400 has a program load circuit 112 that loads multiple soft cores into the traffic processor 404 from the storage flash 108 .
  • the program load circuit loads soft cores as a function of the operational state of the traffic processor 404 .
  • the traffic processor 404 includes a traffic processing soft core 424, a configuration processing soft core 416, a persistent soft core 408, traffic ports 418, a program memory 412, and a configuration information store 420.
  • Soft cores are outlined in the figure with dashed lines and are loaded into a FPLC as images by the program load circuit 112 .
  • the program memory 412 holds software for execution by the configuration processing soft core 416 .
  • the software can be loaded by the program load circuit 112 .
  • One or more storage media within the FPLC that implements the traffic processor are used for the program memory 412 and the configuration information store 420 in this embodiment.
  • the program load circuit 112 has pointers to know where the various images are loaded in the storage flash 108 for the various states. The next state is communicated to the program load circuit 112 and the pointer is found to know which addresses from the storage flash 108 to feed into the traffic processor FPLC 404 . The stream of programming information is fed from the storage flash 108 into the programming interface of the FPLC by the program load circuit.
  • the configuration processing soft core 416 performs configuration for the other states of operation, for example, key loading and management, decryption of classified images, built-in test, etc.
  • the configuration processing soft core 416 includes a processor, but other embodiments could perform the same actions without use of a processor.
  • the produced configuration information is recorded in the configuration information store 420 and includes various things such as decoded keys, operational parameters, cryptographic algorithm variables, data ports to use, configuration of data passed to/from the traffic processing soft core 424 .
  • the persistent soft core 408 could be used for loading PLIs or soft cores, optionally decrypting PLIs or soft cores, managing keys and security, and/or a state machine for flipping between the various cores used by the various states.
  • the persistent soft core 408 aids in loading images and storing semaphores or parameters that are passed between states.
  • the state machine for flipping between various cores could be external to the traffic processor 404 in some embodiments. Other embodiments could have transitions to another state decided by a loaded PLI or soft core. For example, when the traffic processing soft core 424 detected an error that required reset, it could trigger loading the configuration processing soft core 416 and pass it the appropriate parameters.
  • the state machine and image loading logic can be within the FPLCs to allow the state machine to remain during reprogramming of the FPLC.
  • Other embodiments could have the image loading logic external when the FPLC does not support partial reconfiguration.
  • the reconfiguration of the entire FPLC could be triggered by a state machine that would be overwritten in the process of reconfiguring.
  • the new configuration could have a new state machine capable of triggering a transition to another state that would use the external image loading logic to load a new image into the FPLC.
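The two-state flip described above can be sketched as a small state machine that swaps soft-core sets on each transition; the state names, core names and class structure below are illustrative, not from the patent.

```python
# Hedged sketch: on each state transition, the program load circuit swaps
# in the soft-core set for the new state, so configuration and traffic
# processing can share the same FPLC resources at different times.
STATE_CORES = {
    "configuration": ["persistent", "configuration_processing"],
    "traffic":       ["persistent", "traffic_processing", "traffic_ports"],
}

class TrafficProcessor:
    def __init__(self):
        self.state = "configuration"
        self.loaded = list(STATE_CORES["configuration"])

    def transition(self, new_state):
        # the state machine triggers the program load circuit to reload
        self.state = new_state
        self.loaded = list(STATE_CORES[new_state])

tp = TrafficProcessor()
tp.transition("traffic")          # configuration done: process data
tp.transition("configuration")    # e.g. a key change: flip back briefly
```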
  • the traffic processing soft core 424 and traffic ports 418 can be loaded by the program load circuit and consume some of the same resources in the device that were consumed by the configuration processing soft core 416 .
  • the traffic processing soft core 424 does not use any software and instead is controlled by state engines in this embodiment.
  • Information needed for the traffic processing soft core 424 is available from the configuration information store 420 in the traffic processing state.
  • the traffic ports 418 are used by the traffic processing soft core 424 to send and receive information with the traffic processor 404 .
  • Other embodiments could avoid use of general-purpose processors for any state and rely on state machines for each state instead.
  • If additional configuration is needed at some point, data processing temporarily ceases and the traffic processor 404 returns to the configuration state. Data for processing could be buffered or otherwise delayed until the configuration state completes and the traffic processing state resumes.
  • the traffic processor 404 flips between states, and the program load circuit 112 loads the necessary soft cores to allow operation with a reduction in resources for the device.
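As a rough illustration of this flipping, a Python sketch is given below. The class names, image names, and the string-based "processing" are hypothetical stand-ins for the soft cores and the program load circuit 112 described above, not an actual hardware implementation.

```python
from enum import Enum, auto

class State(Enum):
    CONFIGURATION = auto()
    TRAFFIC = auto()

class ProgramLoadCircuit:
    """Stand-in for program load circuit 112: loads named soft-core images."""
    def __init__(self, storage_flash):
        self.storage_flash = storage_flash  # image name -> image bytes
        self.loaded = []                    # images currently in the fabric

    def load(self, *image_names):
        # Writing new soft cores over prior ones lets both states reuse
        # the same device resources.
        self.loaded = [self.storage_flash[name] for name in image_names]

class TrafficProcessor:
    """Flips between a configuration state and a traffic processing state."""
    def __init__(self, load_circuit):
        self.load_circuit = load_circuit
        self.state = State.CONFIGURATION
        self.config_store = {}              # configuration information store 420

    def configure(self):
        self.load_circuit.load("persistent", "config_core")
        self.config_store["params"] = "decoded keys, ports, variables"
        # Configuration complete: swap in the traffic cores and change state.
        self.load_circuit.load("persistent", "traffic_core", "traffic_ports")
        self.state = State.TRAFFIC

    def process(self, frame):
        if self.state is not State.TRAFFIC:
            raise RuntimeError("not in traffic processing state")
        return frame.upper()                # placeholder for real processing
```

The persistent core stays loaded across both states, mirroring the persistent soft core 408 that survives reprogramming.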
  • Referring next to FIG. 4B, a block diagram of another embodiment of the traffic processing system 400-2 is shown.
  • This embodiment uses a volatile memory 132 external to the FPLC implementing the traffic processor 404 to store the program memory 412 and the configuration information store 420 .
  • a memory interface (not shown) can be another soft core that is used to interact with the volatile memory 132 .
  • Referring next to FIG. 4C, a block diagram of yet another embodiment of the traffic processing system 400-3 is shown.
  • This embodiment of the traffic processing system 400-3 has an initiator PLI 104 as its configuration processing soft core 416 . Additionally, the traffic processing soft core 424 is replaced with the crypto PLI 124 . The functionality of the initiator PLI 104 and the crypto PLI 124 is described above.
  • Referring next to FIG. 4D, a block diagram of still another embodiment of the traffic processing system 400-4 is shown.
  • the configuration information store 420 is retained within the FPLC of the traffic processor 404 .
  • the FPLC could have embedded memory or could use a soft core to implement memory to hold the configuration information between states.
  • Referring next to FIGS. 5A & 5B, diagrams of an embodiment of a layout of a traffic processor 404 are shown in two different states.
  • This embodiment of the traffic processor includes memory within the FPLC.
  • the memory is used to store the program memory 412 and the configuration information 420 .
  • FIG. 5A shows the loaded soft cores used in a configuration state
  • FIG. 5B shows the soft cores loaded in a traffic processing state for this embodiment.
  • the configuration state in FIG. 5A has a persistent soft core 408 , a configuration processing soft core 416 in addition to the onboard memory.
  • the configuration processing soft core 416 is overwritten with a traffic processing soft core 424 and traffic ports 418 . Traffic ports 418 are not used in the configuration state 604 because they only serve to pass traffic.
  • Referring next to FIGS. 5C & 5D, diagrams of another embodiment of a layout of a traffic processor 404 are shown in two different states.
  • the layout of the soft cores in a configuration state is shown in FIG. 5C
  • the layout of the soft cores in a traffic processing state is shown in FIG. 5D .
  • This embodiment uses an external battery-backed memory 106 .
  • a memory interface soft core 508 is present in both states to interface to the battery-backed memory 106 . Processing can flip back and forth between the two states as necessary during normal operation. Certain soft cores partially or wholly overlap for different states.
  • Referring next to FIGS. 5E & 5F, diagrams of one embodiment of a layout of a traffic processor 404 are shown in two different states.
  • This traffic processor 404 performs cryptographic processing, for example, using the cryptographic processor system 100 .
  • an initiator PLI 104 is used to perform configuration, key management, etc.
  • the crypto PLI 124 and interface PLI 140 are loaded in a second state in a manner that overlaps partially or wholly the initiator PLI 104 . Operation can move between these states during normal operation of the traffic processor 404 .
  • Referring next to FIGS. 5G & 5H, diagrams of another embodiment of a layout of a traffic processor 404 are shown in two different states.
  • This traffic processor 404 performs cryptographic processing, for example, using the cryptographic processor system 100 . Operation moves between at least two states, with one having an initiator PLI 104 loaded and the other having a crypto PLI 124 loaded. In this embodiment, there is no overlap between the PLIs swapped out between states. While in one state, one or more PLIs are inoperable until a transition to the state that uses those PLIs.
  • Referring next to FIG. 6A, an embodiment of a state machine 600-1 used to control the traffic processor 404 is shown.
  • This embodiment operates in two states, namely, a configuration state 604 and a traffic processing state 608 .
  • After reset or boot, processing enters the configuration state 604 .
  • Once configuration is complete, processing goes to the traffic processing state 608 .
  • When more configuration is needed, processing goes from the traffic processing state 608 to the configuration state 604 before returning to the traffic processing state 608 .
  • Appropriate images are loaded into the device when switching between states.
  • the size of the FPLC is dictated by the largest amount of logic in any state.
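Since only one state's soft cores occupy the device at a time, the required device size follows a max-over-states rule rather than a sum-over-cores rule. The footprint numbers below are made up purely for illustration.

```python
# Each state maps soft-core names to hypothetical logic-cell footprints.
states = {
    "configuration": {"persistent": 2000, "config_core": 9000},
    "traffic": {"persistent": 2000, "traffic_core": 7000, "traffic_ports": 1500},
}

def required_fplc_size(states):
    """The FPLC holds one state's cores at a time, so its required size is
    the largest per-state total, not the total over every distinct core."""
    return max(sum(cores.values()) for cores in states.values())

required_fplc_size(states)  # → 11000 cells, versus 19500 if all cores coexisted
```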
  • Referring next to FIG. 6B, another embodiment of the state machine 600-2 used to control the traffic processor 404 is shown.
  • This embodiment controls a cryptographic processor system 100 in three states.
  • a control and key management state 612 and a cryptographic processor setup state 616 use a configuration processing soft core 416 , but a cryptographic processing state 620 has no need for a general-purpose processor. Between each state, different soft cores can be loaded into the cryptographic processor system 100 .
  • the control and key management state 612 and a cryptographic processor setup state 616 may use the same soft cores such that loading images between states is not required.
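One way to picture the three-state machine 600-2 is as a transition table; the event names below are illustrative assumptions, not terms from the disclosure.

```python
# Hypothetical transition table for the three-state machine 600-2 of FIG. 6B.
TRANSITIONS = {
    "control_and_key_management": {"keys_loaded": "crypto_setup"},
    "crypto_setup": {"setup_complete": "crypto_processing"},
    "crypto_processing": {"reconfigure": "crypto_setup",
                          "key_update": "control_and_key_management"},
}

def next_state(state, event):
    """Stay in the current state on unrecognized events; new soft cores only
    need to be loaded when the destination state uses different logic."""
    return TRANSITIONS[state].get(event, state)
```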
  • Referring next to FIG. 7, a flow diagram of an embodiment of a process 700 for cryptographically processing information in a two-state configuration is shown.
  • the depicted portion of the process begins in block 704 where the storage flash 108 is loaded with configuration images used for the various soft cores.
  • the storage flash 108 is loaded with the traffic state images used in the various soft cores for that state.
  • software for a general purpose processing core can be loaded into the storage flash 108 or directly into the program memory 412 .
  • Normal operation of the cryptographic processor system 100 begins in block 712 when the PLI or FPLC is reset or powered-up.
  • the configuration image(s) are loaded into the device in block 716 .
  • the traffic processor 404 begins operation in the configuration state, initializing the PLIs in block 720 and assembling configuration information for use in other state(s) in block 724 .
  • the configuration information is stored in the configuration information store 420 in block 728 .
  • Transitioning into block 732 activates the traffic processing state by loading traffic processing soft core(s) into the traffic processor 404 .
  • Loading of new soft cores may be preceded by overwriting and/or erasing of prior soft cores or may simply be accomplished by writing the new soft cores over the prior soft cores.
  • the traffic image is configured with the stored traffic configuration information upon activating the traffic processing soft core in block 736 .
  • Processing traffic takes place in block 740 .
  • Other embodiments could perform cryptographic processing in block 740 , for example, using the cryptographic processor system 100 . So long as configuration is not needed or errors are not detected in block 744 , traffic processing continues.
  • Any error detected in block 744 is recorded in block 748 before looping back to block 712 where the device is rebooted.
  • Other embodiments could take any number of remedial measures depending on the error encountered; for example, keys or key layer(s) could be destroyed in a cryptographic application.
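The error-handling loop of blocks 740 through 748 and 712 might be sketched as follows, with key-layer destruction shown as one possible remedial measure. The function and argument names are hypothetical.

```python
def run_with_remediation(process_traffic, error_log, key_layers, max_boots=3):
    """Sketch of blocks 740-748: run traffic processing, and on any detected
    error record it (block 748), destroy a key layer as a remedial measure,
    then loop back to reboot (block 712) and try again."""
    for boot in range(max_boots):
        try:
            process_traffic()
            return "clean shutdown"
        except Exception as err:
            error_log.append(repr(err))     # block 748: record the error
            if key_layers:
                key_layers.pop()            # remedial: destroy one key layer
            # in hardware, the device would reboot here (block 712)
    return "halted after repeated errors"
```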
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof.
  • the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), soft core processors, hard core processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • Software can be used instead of, or in addition to, these hardware implementations.
  • the embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged.
  • a process is terminated when its operations are completed, but could have additional steps not included in the figure.
  • a process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
  • embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium.
  • a code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein.
  • Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein.
  • software codes may be stored in a memory.
  • Memory may be implemented within the processor or external to the processor.
  • the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
  • machine-readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.

Abstract

In another embodiment, a method for securing a field-programmable logic chip or circuit (FPLC) is disclosed. Information is cryptographically processed within the FPLC. An error condition is detected outside of the FPLC and the error condition is communicated to the FPLC to disrupt an image(s) within the FPLC. Optionally, at least a portion of a key can be erased such that cryptographic processing is curtailed or eliminated.

Description

  • This application claims the benefit of and is a non-provisional of co-pending U.S. Provisional Application Ser. No. 61/026,438 filed on Feb. 5, 2008, which is hereby expressly incorporated by reference in its entirety for all purposes.
  • This application expressly incorporates by reference each of the following co-pending patent applications in their entirety for all purposes: U.S. application Ser. No. ______, filed on the same date as the present US application, entitled “Overlapping State Areas” (temporarily referenced by Attorney Docket No. 017018-014720US); U.S. application Ser. No. ______, filed on the same date as the present US application, entitled “Trusted Boot” (temporarily referenced by Attorney Docket No. 017018-014710US).
  • BACKGROUND
  • This disclosure relates in general to field-programmable logic chips or circuits (FPLCs) and, but not by way of limitation, to FPLCs used in traffic processing such as cryptographic processing.
  • Programmable circuitry is common in logic design, but achieving the level of security and high-assurance desired by governments, individuals and certain businesses has been difficult. Programmability is seen as a risk to achieving security and high-assurance. This is especially true when programmability is utilized in the field. One could imagine the programmability feature being used to cause the FPLC to operate in a mode that would not have the required security.
  • Cryptographic circuitry can fail or be compromised. Where such a circuit enters an error mode, there is no recovery. Failure can result in insecure functioning that is not desirable. The cryptographic circuit can erase keys to prevent further activity. Even without keys, the cryptographic circuit can perform in undesirable modes when malfunctioning. With programmability, the risk of these malfunctions is greater.
  • FPLCs have many advantages over fixed circuitry, but cost is generally not one of them. Various images and soft cores are loaded into FPLCs. The size of the FPLC is chosen to accommodate all the images needed for a given design. When smaller or fewer FPLCs are possible, the cost of producing a unit decreases.
  • SUMMARY
  • In another embodiment, a method for securing a field-programmable logic chip or circuit (FPLC) is disclosed. Information is cryptographically processed within the FPLC. An error condition is detected outside of the FPLC and the error condition is communicated to the FPLC to disrupt an image(s) within the FPLC. Optionally, at least a portion of a key can be erased such that cryptographic processing is curtailed or eliminated.
  • In one embodiment, the present disclosure provides a method for providing system security for a FPLC. The method includes cryptographically processing information within the FPLC and detecting an error condition. The detecting operation is performed external to the FPLC. The method also includes conveying into the FPLC the error condition. The method also includes disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected where the image cannot perform cryptographic processing after disruption even if keys are present.
  • In one embodiment, the present disclosure provides a cryptographic processing system for providing system security for a FPLC. The cryptographic processing system includes a cryptographic soft core at least partially within the FPLC that cryptographically processes information. The cryptographic processing system also includes a system security manager that detects an error condition. The detection is performed external to the FPLC, the cryptographic soft core in the FPLC is disrupted when the error condition is detected and the cryptographic soft core cannot perform cryptographic processing after disruption even if keys are present.
  • In one embodiment, the present disclosure provides a cryptographic processing system for providing system security for a FPLC. The cryptographic processing system includes a means for cryptographically processing information within the FPLC. The system also includes a means for detecting an error condition. The detection is performed external to the FPLC. The system also includes a means for disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected. The image cannot perform cryptographic processing after disruption even if keys are present.
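The disruption idea common to these embodiments — an externally detected error renders the image unusable even though keys may remain — can be sketched in software. The classes below are toy models, and the XOR "cipher" is a stand-in for real cryptographic processing, not an implementation of the disclosed hardware.

```python
class Fplc:
    """Toy FPLC holding a cryptographic soft-core image and a key."""
    def __init__(self, image, key):
        self.image, self.key = image, key

    def encrypt(self, data):
        if self.image is None:
            raise RuntimeError("image disrupted: no cryptographic processing")
        # XOR keystream as a stand-in for the real cryptographic soft core.
        return bytes(b ^ k for b, k in zip(data, self.key * len(data)))

    def disrupt_image(self):
        self.image = None   # the key may remain, but no image can use it

class SystemSecurityManager:
    """Monitors error conditions outside the FPLC and disrupts it on error."""
    def __init__(self, fplc):
        self.fplc = fplc

    def check(self, error_flags):
        if any(error_flags):
            self.fplc.disrupt_image()
            return "disrupted"
        return "ok"
```

Note that after disruption the key object still exists, yet no cryptographic processing is possible — the property each of the above embodiments requires.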
  • Further areas of applicability of the present disclosure will become apparent from the detailed description provided hereinafter. It should be understood that the detailed description and specific examples, while indicating various embodiments, are intended for purposes of illustration only and are not intended to necessarily limit the scope of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present disclosure is described in conjunction with the appended figures:
  • FIGS. 1A, 1B and 1C depict block diagrams of embodiments of a cryptographic processor system;
  • FIG. 2A depicts a block diagram of an embodiment of a system security manager (SSM);
  • FIG. 2B depicts a block diagram of an embodiment of a fail-safe SSM;
  • FIG. 3 illustrates a flowchart of an embodiment of a process for booting a cryptographic processor system;
  • FIGS. 4A, 4B, 4C and 4D depict block diagrams of embodiments of a traffic processing system;
  • FIGS. 5A, 5B, 5C, 5D, 5E, 5F, 5G, and 5H depict diagrams of embodiments of a layout of a field-programmable logic chip (FPLC) implementing a traffic processor;
  • FIGS. 6A and 6B depict diagrams of embodiments of a state machine used to control the traffic processor; and
  • FIG. 7 illustrates a flowchart of an embodiment of a process for cryptographically processing information in a two state configuration.
  • In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
  • DETAILED DESCRIPTION
  • The ensuing description provides preferred exemplary embodiment(s) only, and is not intended to limit the scope, applicability or configuration of the disclosure. Rather, the ensuing description of the preferred exemplary embodiment(s) will provide those skilled in the art with an enabling description for implementing a preferred exemplary embodiment. It being understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope as set forth in the appended claims.
  • I. Trusted Boot
  • Referring first to FIG. 1A, a block diagram of an embodiment of a cryptographic processor system 100-1 is shown. There are a number of field-programmable logic chips or circuits (FPLCs) in this embodiment, which may be field-programmable gate arrays (FPGAs), programmable logic devices (PLDs), complex PLDs (CPLDs), or any other circuit that can be programmed with some logic or a soft core after manufacture, for example, in the field. Programmability in the field includes programmability while manufacturing a system including the FPLC or programmability while the system is deployed with an end user. A FPLC is a circuit chip or die in its own package or chips or dice in a multi-chip module.
  • A programmable logic image (PLI) is a soft core of functionality that can be programmed into a FPLC or otherwise implemented. The PLI could include a general-purpose processor, a state machine, an application-specific processor, a cryptofunction, and/or configuration information and parameters. A number of PLIs may be in a single FPLC or a single PLI may be spread out among a number of FPLCs. The blocks shown in the figures can be combined or split in various embodiments.
  • A number of PLIs are used to process traffic or, more specifically, to cryptographically process traffic. Plain text information is received by the interface PLI 140 for encryption processing, and cipher text information is output by the interface PLI 140. Conversely, cipher text information is received by the interface PLI 140 for decryption processing, and plain text information is output by the interface PLI 140. The interface PLI 140 can pass information without cryptographic processing in some cases. For example, the three PLIs 104, 124, 140 in this embodiment could be implemented in one or two FPLCs, with the first holding the initiator PLI, the second holding the interface PLI, and the crypto PLI divided between the two FPLCs.
  • A crypto PLI 124 performs cryptographic processing in a traffic processing state. These are just representative blocks for performing cryptographic processing and could be combined or separated in various embodiments. If loaded into the same FPLC, PLIs can be isolated from each other with a buffer of unused cells and controlled interfaces between the isolated areas. Signals from one PLI can be kept from routing outside the isolated area except where a deliberate port is configured to pass information between PLIs. In this way, isolation can be achieved in the same device unless interaction through a port is desired.
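The buffer-of-unused-cells idea can be checked with simple rectangle arithmetic. The inclusive-coordinate convention and the function below are illustrative assumptions, not part of the disclosure.

```python
def regions_isolated(a, b, buffer_cells=1):
    """Given two rectangular PLI placements (x0, y0, x1, y1) in inclusive
    cell coordinates, check that at least `buffer_cells` unused rows or
    columns separate them, so signals cannot route from one PLI to the
    other except through a deliberately configured port."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    gap_x = max(bx0 - ax1, ax0 - bx1) - 1   # unused columns between regions
    gap_y = max(by0 - ay1, ay0 - by1) - 1   # unused rows between regions
    return max(gap_x, gap_y) >= buffer_cells
```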
  • Soft cores for the various PLIs in their various versions are held in a storage flash 108. These soft cores are sometimes referred to as images. Additionally, key fragments or layers can be held in the storage flash 108. The storage flash 108 can additionally hold software to boot and run any processor of the cryptographic processor system 100. Any type of flash or other non-volatile memory can be used for the storage flash 108.
  • This embodiment also includes volatile memory 132. A processing core within the initiator PLI 104 loads software from the storage flash 108 and uses the volatile memory 132 for program operation and variable storage. RAM, SRAM or any type of volatile memory could be used for the volatile memory 132. Other embodiments could use non-volatile memory, for example, MRAM for the volatile memory 132.
  • A battery or other power source (not shown) is used to allow a battery-backed memory 106 to retain its contents even when main power is interrupted or lost. SRAM or DRAM can be used for the battery-backed memory. The battery-backed memory 106 may also hold key layers. In one embodiment, some key layers are stored in the storage flash 108 while others are stored on the battery-backed memory 106. Further, key layers can be held on a token that is removably coupled to the cryptographic processor system 100 through a token interface. Any type of non-volatile memory can be used for the storage flash 108, the battery-backed memory 106 or token.
  • The program load circuit 112 loads soft cores for the PLIs into one or more FPLCs. The program load circuit 112 could be implemented with a CPLD, for example. The various images or soft cores in the storage flash 108 are loaded into the programmable logic in a particular sequence that can be controlled by the program load circuit 112 and/or the loaded PLIs. Certain PLIs can be loaded into the programmable logic and later removed such that other PLIs can recover some of the programmable logic when the removed PLIs are not needed.
  • An initiator PLI 104 may assist in this process and perform other configuration once the soft core of the initiator PLI 104 is loaded and functioning. Certain configuration actions such as loading keys, built-in test and other housekeeping functions can be performed by the initiator PLI 104. In some embodiments, the soft core of the initiator PLI 104 may be removed if not needed and reloaded when it is needed. Memory can be used to pass information from the initiator PLI 104 to other PLIs to use when the initiator PLI is not loaded.
  • As further described below, the system security manager (SSM) 116 monitors for errors and alarms before taking remedial action on the PLIs, FPLCs and/or key layers. Redundancy can be used by the SSM 116 to operate in a failsafe, trusted and/or high-assurance manner.
  • Referring next to FIG. 1B, a block diagram of another embodiment of the cryptographic processor system 100-2 is shown that uses a fail-safe SSM 118 instead of a SSM 116. The fail-safe SSM 118 includes redundancy and/or other high-assurance circuits as further described below. This embodiment also includes a processor chip 128 instead of or in addition to a processing soft core within the initiator PLI 104. The processor chip 128 is a hardware processor separate from the FPLC holding the initiator PLI 104. Volatile memory 132 is used by the processor chip 128 for program operation and variable storage.
  • With reference to FIG. 1C, a block diagram of yet another embodiment of a cryptographic processor system 100-3 is shown. This embodiment includes a security manager PLI 136 that is embedded in a FPLC. Some or all of the FPLCs used in the cryptographic processor system 100 could have their own security manager PLI 136. The security manager PLI could be used in addition to a SSM 116 or a fail-safe SSM 118 in various embodiments. The security manager PLI 136 is further described below.
  • Referring next to FIG. 3, a flow chart of an embodiment of a process 300 for booting a cryptographic processor system 100 is shown. The depicted portion of the process begins in block 304 where the storage flash 108 is loaded with one or more default images. Some FPLCs have a decryption algorithm built into the chip to be programmed that uses a fixed key to allow decryption of images loaded into that chip. For example, Xilinx™ and Altera™ could provide on-chip advanced encryption standard (AES) decryption of images using a predetermined key that is fixed and battery-backed on the chip. Block 304 would include encrypting the default image(s) with the appropriate key prior to storage in the storage flash 108.
  • The default images provide just enough logic to get the cryptographic processor system 100 running in a configuration state, but not enough to allow data throughput in this embodiment. For example, the initiator PLI 104 can be a default image in some embodiments. The default images support built-in-test to allow checking that the logic circuitry at least has some functionality and that the circuit card was assembled properly. Other embodiments could do certain unclassified traffic processing with the default images. A loaded default image allows running and updating of the operational software. Additionally, the default images include another decryption algorithm that is unclassified. This unclassified decryption algorithm is a soft core that can be loaded into a FPLC to allow decryption and loading of additional logic in the same chip or another chip.
  • In the United States, the government classifies certain cryptographic algorithms while others are unclassified. Classified algorithms are not available to the general public and are controlled by government regulation. Each country can have its own classified and unclassified cryptographic algorithms, which may differ from those of other countries. As the control of classified and unclassified algorithms differs in a particular country, embodiments use a mixture in certain circumstances.
  • In block 308, protected images are loaded into the storage flash 108. These may be optionally encrypted to allow decoding with the on-chip decryption algorithm. The protected images are encrypted to allow decryption with the unclassified decryption algorithm and an appropriate key. The protected images allow full cryptographic processing when properly loaded and enabled with the appropriate key(s). In one embodiment, a particular portion of a FPLC or a whole programmable FPLC could start out with a default image that is later replaced wholly or in part with another protected or unprotected image. Other embodiments could keep some or all of the default image functioning alongside protected images.
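To make the protect/decrypt flow concrete, here is a sketch using a SHA-256 counter-mode keystream as a stand-in for the unclassified decryption algorithm. A real system would use a vetted cipher such as AES rather than this hand-rolled construction, and the function names are hypothetical.

```python
import hashlib

def keystream(key, length):
    """Illustrative keystream: SHA-256 over key || counter, concatenated.
    This is a teaching stand-in, not a vetted cipher."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def protect_image(image, key):
    """Encrypt a soft-core image before storing it in the storage flash 108."""
    return bytes(a ^ b for a, b in zip(image, keystream(key, len(image))))

# XOR stream cipher: encryption and decryption are the same operation.
decrypt_image = protect_image
```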
  • Operation of the cryptographic processor system 100 is begun in block 316. Prior blocks 304, 308 and 312 could be performed at the factory in one embodiment. Block 312 is likely to be done at least partially in the field as layers of the multi-layered key may change over time or be erased. Booting may be begun by application of power to the cryptographic processor system 100 or by a reset operation or other remedial action.
  • One or more default images are retrieved from the storage flash 108 and loaded into programmable logic by the program load circuit 112 in block 320. For example, the initiator PLI 104 could be loaded as a default image. The default image(s) may be encrypted and a previously-loaded default image, such as the initiator PLI 104, could automatically decrypt the default image before loading into the programmable logic. Other embodiments may use the cryptographic function built into some FPLCs as explained above. Yet other embodiments could use the built-in cryptographic function and a soft core cryptographic function to utilize double decryption. Embodiments may have various blocks implemented in one or more PLIs to separate different functions within a given FPLC.
  • Once the initiator PLI 104 has the default image loaded in unencrypted form, a general purpose processor is available as a soft core that was part of the default image. Other embodiments could use a hard core for the processor used by the initiator PLI and external to any FPLC. Memory is available for processing and retrieval of operational software that is booted in block 324. Once the default software is loaded there is enough intelligence to accept new software and/or keys, but no cryptographic processing of traffic can be done in this embodiment of the default software. In this embodiment, the initiator PLI 104 has the unclassified decryption algorithm as a soft core or available in software. The PLIs operate with default images in block 324.
  • The initiator PLI retrieves the remainder of the default images in block 326. Those default images are loaded using optional decryption before built-in-test is performed on each of the other PLIs. During the loading process, verification of checksums, CRCs or hashes can be performed to confirm the default images were loaded correctly. The initiator PLI 104 can reload the default images where there is an error in the checksum, CRC or hash.
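The load-verify-reload loop can be sketched as below. SHA-256 stands in for whichever checksum, CRC, or hash the design uses, and the function names are hypothetical.

```python
import hashlib

def load_with_verification(fetch_image, expected_hash, max_attempts=3):
    """Load an image, verify its hash against a predetermined value, and
    reload on mismatch; give up after a few attempts rather than run
    unverified logic."""
    for attempt in range(max_attempts):
        image = fetch_image()
        if hashlib.sha256(image).hexdigest() == expected_hash:
            return image
        # corrupted load: fall through and try again
    raise RuntimeError("image failed verification after repeated loads")
```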
  • A determination in block 328 analyzes whether the multi-layer keys are present to continue booting the cryptographic processor system 100. The multi-layer key is reconstituted by retrieving various layers or portions from different locations. For example, the storage flash 108, the token and/or the battery-backed memory 106 may each hold a portion of the multi-layer key. Where one or more layers of the multi-layer key are missing, processing loops back to block 312 to wait for loading of the missing key layer(s). If the multi-layer key is present, processing goes from block 328 to block 332, but goes from block 328 to block 312 if the multi-layer key is missing.
  • The multi-layered key(s) are loaded into the cryptographic processor system 100 in block 312. The multi-layered key has multiple portions that are all needed by the unclassified decryption algorithm to decrypt the protected images. Layers of the multi-layered key(s) can be erased if error conditions are found. Without all of the layers, the multi-layered key will not be usable. A battery-backed memory 106 and/or flash may be used to store the various layers of the multi-layered key. This embodiment stores various layers or values of the keys in the storage flash 108, the battery-backed memory 106 and/or a token coupled to a token interface. The layers may additionally be encrypted, for example, the layer retrieved from the storage flash 108 is encrypted in one embodiment. Some locations may store multiple layers or portions of the multi-layer key, for example, the battery-backed memory 106 could have two portions that can be individually erased or scrambled to destroy the multi-layer key. The condition for destroying each layer of the key could be different to protect against different threats. Once the multi-layered key is loaded, processing goes from block 312 back to block 328.
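The multi-layer key handling described in the blocks above can be sketched in code. This is a minimal illustration only: the patent does not specify how the layers are combined, so the XOR scheme and the function names below are assumptions.

```python
# Illustrative sketch of multi-layer key reconstitution. Layers are retrieved
# from different stores (storage flash, battery-backed memory, token); the
# XOR combination is a hypothetical choice, not specified by the embodiment.

def reconstitute_key(layers):
    """Combine key layers into one usable key. Returns None if any layer
    is missing, mirroring the loop back to the key-load block to wait for
    the missing layer(s)."""
    if any(layer is None for layer in layers):
        return None  # a layer was erased or never loaded
    key = bytes(len(layers[0]))
    for layer in layers:
        # XOR each layer in; without every layer the result is unusable
        key = bytes(a ^ b for a, b in zip(key, layer))
    return key

def erase_layer(layers, index):
    """Destroying any single layer renders the whole multi-layer key
    unusable, even though the other layers remain intact."""
    layers[index] = None
```

This mirrors the property stated above: erasing or scrambling any one stored portion (for example, one of the two portions in the battery-backed memory 106) destroys the multi-layer key as a whole.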
  • In block 332, the initiator PLI 104 loads a soft core version of the unclassified decryption algorithm. The unclassified decryption algorithm could be AES, triple-DES or any other appropriate algorithm. The multi-layer key is constituted by retrieving layers from two or more locations. The protected soft cores are retrieved from the storage flash 108 and decrypted by the initiator PLI 104 in block 336. Some embodiments may also use the AES encryption built into the FPLC to utilize double decryption. That second order of protection would involve decryption again as the protected image is loaded into a particular FPLC. Additionally, the initiator PLI 104 may calculate a checksum or hash as a soft core is loaded. The initiator PLI 104 can compare the checksum or hash against a predetermined value to confirm the soft core was loaded correctly. The initiator PLI 104 can reload the protected image where there is an error in the checksum or hash.
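The verify-and-reload behavior of the initiator PLI 104 can be sketched as follows. The choice of SHA-256 and the retry limit are illustrative assumptions; the embodiment only says a checksum, CRC or hash is compared against a predetermined value.

```python
import hashlib

def load_protected_image(fetch, expected_digest, max_retries=3):
    """Fetch (and implicitly decrypt) a protected image, verify its hash
    against a predetermined value, and re-fetch on mismatch, as the
    initiator PLI does when a load error is detected."""
    for _ in range(max_retries):
        image = fetch()
        if hashlib.sha256(image).digest() == expected_digest:
            return image  # image loaded correctly
    raise RuntimeError("image failed verification after retries")
```

A transient load error is simply retried; only a persistent mismatch becomes an error condition for the monitoring logic to act on.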
  • The PLIs loaded with their default or protected images can be tested with various built-in tests (BIT) in block 338. For example, known answer verifications could be performed, where a known input is fed to one or more PLIs to determine if the known output is produced. Other possible testing of the PLIs includes scan chain testing, check words, checksums, boundary scan, integrity tests, and other tests. After completion of block 338, the cryptographic processor system 100 is in a trusted state. Beyond testing, the trusted state is implemented with redundancy in this embodiment.
  • Monitoring is performed in block 344 for error conditions after the known answer and other verifications. Where there are conditions that could indicate a security concern, one or more layers of the multi-layer key are deleted in block 348 before looping back to block 312 to wait for a new key layer load, which may be manually or automatically performed. From block 312, the booting process could loop back to either block 316 or block 338 depending on the severity of the error. With other errors detected in block 344, processing goes from block 344 to block 338 to test the PLIs again without removal of one or more key layers. In the absence of errors detected in block 344, the cryptographic processor system 100 is available to encrypt and decrypt traffic in a fully operational mode in block 340. Testing continues in block 344 during operation, either periodically, upon certain events, or when errors are suspected.
  • II. System Security Manager
  • FPGAs and/or FPLCs may have a security manager PLI 136 for programming into the FPLC. For example, Xilinx™ and Altera™ have envisioned a security manager PLI 136. The security manager PLI 136 is a soft core in this embodiment. Periodically, the security manager PLI 136 can check the other soft cores loaded into the FPLC to confirm they match what was originally loaded. Changes in the loaded programming could be detected in this manner. When an error condition is detected, the security monitor can erase the programming from within the FPGA or FPLC such that it returns to an inoperable state. The programmed-in security manager PLI 136 is not redundant. Additionally, these approaches presume that the security manager PLI 136 is operating properly despite other problems detected within the FPLC.
  • With reference to FIG. 2A, an embodiment of a SSM 116 is shown. In one embodiment, the SSM 116 has circuitry external to the FPLCs housing the PLIs. The SSM 116 can activate the internal security monitoring circuit to erase the programming in a FPLC and/or may just overwrite the programming, reset the logic or otherwise prevent further operation of the FPLC. The SSM 116 can work in conjunction with the security monitor or replace the function of the security monitor. In this embodiment, the SSM 116 is outside the FPLC used for other PLIs, but could be implemented in an ASIC, FPGA, CPLD, or PLD. In this embodiment, a CPLD is used to implement the SSM 116 and is not field reprogrammable. Other soft cores or PLIs could be included in the FPLC used by the SSM 116 in other embodiments.
  • Many conditions observed by the SSM 116 may cause security measures to take place. Observed conditions include battery voltage over or under specification, tampering with any circuitry or enclosure, alarm conditions, and triggering of a FPGA's or FPLC's security monitor. Based on an analysis of the threat, the SSM 116 can erase/overwrite/reset PLIs and/or FPLCs, keys, and/or key portions or layers. For example, the system security manager 116 may receive an indication that a particular FPGA security monitor found a single point failure and erased the FPGA. The system security manager 116 could respond by writing an erasing or initialization program into the FPGA before reprogramming it once again. Certain conditions only result in erasing and/or reprogramming a portion of a PLI, a whole PLI, multiple PLIs, a FPLC, or multiple FPLCs.
  • This embodiment has a number of security functions that are activated based upon how the inputs are interpreted by a threat analysis circuit 216. The threat analysis circuit 216 can activate an erase key circuit 204, a PLI wipe circuit 212 and/or a re-fill and test PLI (RATP) circuit 208. The erase key circuit 204 could erase, overwrite and/or otherwise disable a key. A layer of a multi-layer key may be erased or overwritten to effectively disable use of the multi-layer key even though other layers of the multi-layer key are still available. Where keys should be disabled, for example, the erase key circuit 204 can erase layers in the battery-backed memory 106, the token and/or the storage flash 108.
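The dispatch performed by the threat analysis circuit 216 might be modeled as a table mapping observed conditions to the circuits it activates. The condition names and the severity policy below are assumptions for illustration; the patent only says the response depends on an analysis of the threat.

```python
# Hypothetical threat-to-response policy for the threat analysis circuit.
# Action names correspond to the erase key circuit (204), PLI wipe circuit
# (212) and re-fill and test PLI (RATP) circuit (208) described above.
RESPONSES = {
    "battery_out_of_spec": ["erase_key_layer"],
    "enclosure_tamper":    ["erase_key_layer", "wipe_pli"],
    "fpga_monitor_alarm":  ["wipe_pli", "refill_and_test"],
}

def analyze_threat(condition):
    """Return the ordered list of security actions for a condition.
    An unrecognized condition defaults to a severe response."""
    return RESPONSES.get(condition, ["erase_key_layer", "wipe_pli"])
```

The key point the sketch captures is that different threats can trigger different combinations of erasure, wiping and re-fill-and-test, rather than a single fixed response.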
  • The testing of the PLI in the RATP circuit 208 can be built-in-test, security monitor tests, known answer verifications or the like. Re-filling of the PLI can be of the original image, a default image or a null image. Although this embodiment of the RATP circuit 208 and PLI wipe circuit 212 operate at the PLI level, other embodiments could also optionally operate on one or more FPLCs.
  • Referring next to FIG. 2B, an embodiment of a fail-safe SSM 118 is shown that uses redundant SSMs 116. The embodiment of the SSM 116 in FIG. 2A does not operate in a fail-safe mode that is trusted according to some criteria. This embodiment duplicates all of the circuitry in the SSM 116 such that both copies must perform the same way, or an alarm condition is caused. In this way, any failure of one of the SSMs 116 would cause an erasure of the various PLIs, FPLCs, key layers, and/or keys. A consistency check circuit 216 compares inputs and outputs of the parallel SSMs 116 to assure there are matches. If one SSM 116 goes haywire, the PLI containing the fail-safe SSM 118 can be erased, possibly after erasing one or more keys or key layers first.
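The redundant comparison performed by the fail-safe SSM 118 can be sketched as follows, modeling each SSM 116 as a function of its inputs. The decision values are hypothetical; only the compare-and-alarm structure comes from the embodiment.

```python
def failsafe_decision(ssm_a, ssm_b, inputs):
    """Run two identical SSM evaluations on the same inputs and compare
    their outputs, as the consistency check circuit does. Any mismatch
    means one copy has failed, which itself forces an erasure."""
    out_a = ssm_a(inputs)
    out_b = ssm_b(inputs)
    if out_a != out_b:
        return "erase_keys_and_plis"  # one SSM went haywire
    return out_a  # both copies agree; act on the shared decision
```

Note that a faulty SSM cannot silently suppress a response: disagreement between the copies is treated as seriously as a detected threat.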
  • III. Overlapping State Areas for Programmable Crypto Processing Circuits
  • High-assurance and classified applications generally avoid use of PLIs or FPLCs. There are concerns that the reprogrammability of these devices will leave them vulnerable to compromise. When operating these devices, certain logic circuits are only used in certain states. For example, crypto processing systems configure the traffic engine before operating the traffic engine to process information. An embodiment reuses at least some of the same device resources for a configuration state and a cryptographic processing state.
  • Referring to FIG. 4A, a block diagram of an embodiment of a traffic processing system 400-1 is shown. The traffic processing could be cryptographic or other processing on data. This embodiment processes data in a manner that can tolerate delays associated with switching into a configuration state when necessary to perform configuration and any key management before switching back to a traffic processing state to operate upon more data. The traffic processing system 400 has a program load circuit 112 that loads multiple soft cores into the traffic processor 404 from the storage flash 108. The program load circuit loads soft cores as a function of the operational state of the traffic processor 404.
  • Included in the traffic processor 404 at various times are a traffic processing soft core 424, a configuration processing soft core 416, a persistent soft core 408, traffic ports 418, a program memory 412, and a configuration information store 420. Soft cores are outlined in the figure with dashed lines and are loaded into a FPLC as images by the program load circuit 112. The program memory 412 holds software for execution by the configuration processing soft core 416. The software can be loaded by the program load circuit 112. One or more storage media within the FPLC that implements the traffic processor are used for the program memory and the configuration information store 420 in this embodiment.
  • The program load circuit 112 has pointers to know where the various images are loaded in the storage flash 108 for the various states. The next state is communicated to the program load circuit 112 and the pointer is found to know which addresses from the storage flash 108 to feed into the traffic processor FPLC 404. The stream of programming information is fed from the storage flash 108 into the programming interface of the FPLC by the program load circuit.
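The pointer lookup performed by the program load circuit 112 can be sketched as a table keyed by state. The addresses, lengths and state names below are hypothetical; the embodiment only says the circuit holds pointers into the storage flash 108 for each state's images.

```python
# Hypothetical per-state image table: state name -> (start address, length)
# in the storage flash. Real layouts would be device- and design-specific.
IMAGE_TABLE = {
    "configuration": (0x0000, 0x4000),
    "traffic":       (0x4000, 0x8000),
}

def load_state_image(flash, state):
    """Look up the pointer for the requested state and return the region
    of flash that would be streamed into the FPLC programming interface."""
    start, length = IMAGE_TABLE[state]
    return flash[start:start + length]
```

When the next state is communicated to the program load circuit, it amounts to selecting a row of this table and streaming the addressed bytes into the FPLC.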
  • The configuration processing soft core 416 performs configuration for the other states of operation, for example, key loading and management, decryption of classified images, built-in test, etc. In this embodiment, the configuration processing soft core 416 includes a processor, but other embodiments could perform the same actions without use of a processor. The produced configuration information is recorded in the configuration information store 420 and includes various things such as decoded keys, operational parameters, cryptographic algorithm variables, data ports to use, configuration of data passed to/from the traffic processing soft core 424.
  • The persistent soft core 408 could be used for loading PLIs or soft cores, optionally decrypting PLIs or soft cores, managing keys and security, and/or a state machine for flipping between the various cores used by the various states. In this embodiment, the persistent soft core 408 aids in loading images and storing semaphores or parameters that are passed between states. The state machine for flipping between various cores could be external to the traffic processor 404 in some embodiments. Other embodiments could have transitions to another state decided by a loaded PLI or soft core. For example, when the traffic processing soft core 424 detects an error that requires reset, it could trigger loading the configuration processing soft core 416 and pass it the appropriate parameters.
  • For FPLCs that have partial reconfiguration, the state machine and image loading logic can be within the FPLCs to allow the state machine to remain during reprogramming of the FPLC. Other embodiments could have the image loading logic external when the FPLC does not support partial reconfiguration. The reconfiguration of the entire FPLC could be triggered by a state machine that would be overwritten in the process of reconfiguring. The new configuration could have a new state machine capable of triggering a transition to another state that would use the external image loading logic to load a new image into the FPLC.
  • Once configuration is complete, this embodiment has no further need for a general-purpose processor. The traffic processing soft core 424 and traffic ports 418 can be loaded by the program load circuit and consume some of the same resources in the device that were consumed by the configuration processing soft core 416. The traffic processing soft core 424 does not use any software and instead is controlled by state engines in this embodiment. Information needed for the traffic processing soft core 424 is available from the configuration information store 420 in the traffic processing state. The traffic ports 418 are used by the traffic processing soft core 424 to send and receive information with the traffic processor 404. Other embodiments could avoid use of general-purpose processors for any state and rely on state machines for each state instead.
  • If additional configuration is needed at some point, data processing temporarily ceases and the traffic processor 404 returns to the configuration state. Data for processing could be buffered or otherwise delayed until the traffic processing state resumes once the configuration state completes. The traffic processor 404 flips between states and the program load circuit 112 loads the necessary soft cores to allow operation with a reduction in resources for the device.
  • With reference to FIG. 4B, a block diagram of another embodiment of the traffic processing system 400-2 is shown. This embodiment uses a volatile memory 132 external to the FPLC implementing the traffic processor 404 to store the program memory 412 and the configuration information store 420. A memory interface (not shown) can be another soft core that is used to interact with the volatile memory 132.
  • Referring next to FIG. 4C, a block diagram of yet another embodiment of the traffic processing system 400-3 is shown. This embodiment of the traffic processing system 400-3 has an initiator PLI 104 as its configuration processing soft core 416. Additionally, the traffic processing soft core 424 is replaced with the crypto PLI 124. The functionality of the initiator PLI 104 and the traffic processing soft core 424 is described above.
  • With reference to FIG. 4D, a block diagram of still another embodiment of the traffic processing system 400-4 is shown. In this embodiment, the configuration information store 420 is retained within the FPLC of the traffic processor 404. The FPLC could have embedded memory or could use a soft core to implement memory to hold the configuration information between states.
  • With reference to FIGS. 5A & 5B, diagrams of an embodiment of a layout of a traffic processor 404 are shown in two different states. This embodiment of the traffic processor includes memory within the FPLC. The memory is used to store the program memory 412 and the configuration information 420. FIG. 5A shows the loaded soft cores used in a configuration state, and FIG. 5B shows the soft cores loaded in a traffic processing state for this embodiment. The configuration state in FIG. 5A has a persistent soft core 408 and a configuration processing soft core 416 in addition to the onboard memory. In transitioning to the traffic processing state shown in FIG. 5B, the configuration processing soft core 416 is overwritten with a traffic processing soft core 424 and traffic ports 418. Traffic ports 418 are not used in the configuration state 604, as these ports are only used to pass traffic.
  • Referring next to FIGS. 5C & 5D, diagrams of another embodiment of a layout of a traffic processor 404 are shown in two different states. The layout of the soft cores in a configuration state is shown in FIG. 5C, and the layout of the soft cores in a traffic processing state is shown in FIG. 5D. This embodiment uses an external battery-backed memory 106. A memory interface soft core 508 is present in both states to interface to the battery-backed memory 106. Processing can flip back and forth between the two states as necessary during normal operation. Certain soft cores partially or wholly overlap for different states.
  • With reference to FIGS. 5E & 5F, diagrams of one embodiment of a layout of a traffic processor 404 are shown in two different states. This traffic processor 404 performs cryptographic processing, for example, using the cryptographic processor system 100. In a first state shown in FIG. 5E, an initiator PLI 104 is used to perform configuration, key management, etc. The crypto PLI 124 and interface PLI 140 are loaded in a second state in a manner that overlaps partially or wholly the initiator PLI 104. Operation can move between these states during normal operation of the traffic processor 404.
  • Referring next to FIGS. 5G & 5H, diagrams of another embodiment of a layout of a traffic processor 404 are shown in two different states. This traffic processor 404 performs cryptographic processing, for example, using the cryptographic processor system 100. Operation moves between at least two states, with one having an initiator PLI 104 loaded and the other having a crypto PLI 124 loaded. In this embodiment, there is no overlap between the PLIs swapped out between states. While in one state, one or more PLIs are inoperable unless there is a transition to the state that uses that PLI.
  • Referring next to FIG. 6A, an embodiment of a state machine 600-1 used to control the traffic processor 404 is shown. This embodiment operates in two states, namely, a configuration state 604 and a traffic processing state 608. After reset or boot, processing enters the configuration state 604. Once configuration is complete, processing goes to the traffic processing state 608. When more configuration is needed, processing goes from the traffic processing state 608 to the configuration state 604 before returning back to the traffic processing state 608. Appropriate images are loaded into the device when switching between states. The size of the FPLC is dictated by the largest amount of logic in any state.
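The two-state machine 600-1 can be sketched as follows, with the image load modeled as a callback supplied by the program load circuit. The class and method names are illustrative, not from the patent.

```python
class TrafficProcessorFSM:
    """Sketch of the two-state machine of FIG. 6A: after reset the device
    enters the configuration state, then alternates with the traffic
    processing state, loading the appropriate image on each transition."""

    def __init__(self, load_image):
        self.load_image = load_image          # e.g. program load circuit
        self.state = "configuration"          # entered after reset/boot
        self.load_image("configuration")

    def configuration_done(self):
        """Configuration complete: switch to traffic processing."""
        self.state = "traffic"
        self.load_image("traffic")

    def need_reconfiguration(self):
        """More configuration needed: return to the configuration state."""
        self.state = "configuration"
        self.load_image("configuration")
```

Because each transition reloads the device, the FPLC only ever needs to hold one state's logic at a time, which is why its size is dictated by the largest single state rather than the sum of all states.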
  • With reference to FIG. 6B, another embodiment of the state machine 600-2 used to control the traffic processor 404 is shown. This embodiment controls a cryptographic processor system 100 in three states. A control and key management state 612 and a cryptographic processor setup state 616 use a configuration processing soft core 416, but a cryptographic processing state 620 has no need for a general-purpose processor. Between each state, different soft cores can be loaded into the cryptographic processor system 100. In some embodiments, the control and key management state 612 and the cryptographic processor setup state 616 may use the same soft cores such that loading images between states is not required.
  • Referring next to FIG. 7, an embodiment of a flow diagram is shown for a process 700 for cryptographically processing information in a two-state configuration. The depicted portion of the process begins in block 704 where the storage flash 108 is loaded with configuration images used for the various soft cores. In block 708, the storage flash 108 is loaded with the traffic state images used in the various soft cores for that state. Additionally, software for a general purpose processing core can be loaded into the storage flash 108 or directly into the program memory 412.
  • Normal operation of the cryptographic processor system 100 begins in block 712 when the PLI or FPLC is reset or powered-up. The configuration image(s) are loaded into the device in block 716. The traffic processor 404 begins operation in the configuration state, initializing the PLIs in block 720 and assembling configuration information for use in other state(s) in block 724. The configuration information is stored in the configuration information store 420 in block 728.
  • Transitioning into block 732 activates the traffic processing state by loading traffic processing soft core(s) into the traffic processor 404. Loading of new soft cores may be preceded by overwriting and/or erasing of prior soft cores or may simply be accomplished by writing the new soft cores over the prior soft cores. In the traffic processing state, the traffic image is configured with the stored traffic configuration information upon activating the traffic processing soft core in block 736. Processing traffic takes place in block 740. Other embodiments could perform cryptographic processing in block 740, for example, using the cryptographic processor system 100. So long as configuration is not needed and errors are not detected in block 744, traffic processing continues. Any error detected in block 744 is recorded in block 748 before looping back to block 712 where the device is rebooted. Other embodiments could take any number of remedial measures depending on the error encountered; for example, keys or key layer(s) could be destroyed in a cryptographic application.
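The traffic-processing loop of blocks 740-748 can be sketched as follows. The frame-based model and the function names are assumptions for illustration; the process itself only specifies process-check-record-reboot behavior.

```python
def traffic_loop(frames, process, check_error, error_log):
    """Process traffic (block 740) until an error is detected (block 744);
    record the error (block 748) and signal a reboot (loop to block 712)
    rather than continuing to process traffic."""
    for frame in frames:
        if check_error(frame):
            error_log.append(frame)  # block 748: record the error
            return "reboot"          # loop back to block 712
        process(frame)               # block 740: normal traffic processing
    return "ok"
```

In a cryptographic application, the reboot path could be replaced or supplemented by more severe remedial measures, such as destroying keys or key layers, as noted above.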
  • A number of variations and modifications of the disclosed embodiments can also be used. For example, functions that are implemented in PLIs could be performed in software. Code could be run on soft cores within a FPLC or a pre-programmed microprocessor circuit.
  • Specific details are given in the above description to provide a thorough understanding of the embodiments. However, it is understood that the embodiments may be practiced without these specific details. For example, circuits may be shown in block diagrams in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), soft core processors, hard core processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof. Software can be used instead of or in addition to hardware to perform the techniques, blocks, steps and means.
  • Also, it is noted that the embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
  • Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and/or various other storage mediums capable of storing, containing or carrying instruction(s) and/or data.
  • While the principles of the disclosure have been described above in connection with specific apparatuses and methods, it is to be clearly understood that this description is made only by way of example and not as limitation on the scope of the disclosure.

Claims (23)

1. A method for providing system security for a field-programmable logic chip (FPLC), the method comprising:
cryptographically processing information within the FPLC;
detecting an error condition, wherein the detection is performed external to the FPLC;
conveying into the FPLC the error condition; and
disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected where the image cannot perform cryptographic processing after disruption even if keys are present.
2. The method as recited in claim 1, wherein the key is used to program the image into the FPLC.
3. The method as recited in claim 1, wherein the conveying is performed by disrupting programming of the FPLC.
4. The method as recited in claim 1, further comprising erasing at least a portion of a key used for the cryptographic processing.
5. The method as recited in claim 1, wherein the key is broken into portions stored in at least two different memories.
6. The method as recited in claim 1, wherein the disrupting the image includes writing a different image into the FPLC over at least part of the image.
7. The method as recited in claim 1, wherein the disrupting the image is performed by a state machine.
8. The method as recited in claim 1, further comprising detecting the error condition using circuitry that is redundantly implemented.
9. A cryptographic processing system for providing system security for a field-programmable logic chip (FPLC), the cryptographic processing system comprising:
a cryptographic soft core at least partially within the FPLC that cryptographically processes information;
a system security manager that detects an error condition, wherein:
the detection is performed external to the FPLC,
the cryptographic soft core in the FPLC is disrupted when the error condition is detected, and
the cryptographic soft core cannot perform cryptographic processing after disruption even if keys are present.
10. The cryptographic processing system as recited in claim 9, wherein the key is used to program the cryptographic soft core into the FPLC.
11. The cryptographic processing system as recited in claim 9, wherein at least a portion of a key used for the cryptographic processing is disrupted after detecting the error condition.
12. The cryptographic processing system as recited in claim 9, wherein the key is broken into portions stored in at least two different memories.
13. The cryptographic processing system as recited in claim 9, further comprising a security manager programmable logic image (PLI) internal to the FPLC.
14. The cryptographic processing system as recited in claim 9, wherein the disrupting the image includes writing a different image into the FPLC.
15. The cryptographic processing system as recited in claim 9, wherein the disrupting the image is performed by a state machine.
16. The cryptographic processing system as recited in claim 9, further comprising detecting the error condition using circuitry that is redundantly implemented.
17. A cryptographic processing system for providing system security for a field-programmable logic chip (FPLC), the cryptographic processing system comprising:
means for cryptographically processing information within the FPLC;
means for detecting an error condition, wherein the detection is performed external to the FPLC;
means for disrupting an image in the FPLC for one or more soft cores loaded when the error condition is detected where the image cannot perform cryptographic processing after disruption even if keys are present.
18. The cryptographic processing system as recited in claim 17, wherein the key is used to program the image into the FPLC.
19. The cryptographic processing system as recited in claim 17, further comprising means for erasing at least a portion of a key used for the cryptographic processing.
20. The cryptographic processing system as recited in claim 17, wherein the key is broken into portions stored in at least two different memories.
21. The cryptographic processing system as recited in claim 17, wherein the disrupting the image includes means for writing a different image into the FPLC.
22. The cryptographic processing system as recited in claim 17, wherein the disrupting the image is performed by a state machine.
23. The cryptographic processing system as recited in claim 17, wherein the means for detecting the error condition uses circuitry that is redundantly implemented.
US12/366,600 2008-02-05 2009-02-05 System security manager Abandoned US20090240951A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/366,600 US20090240951A1 (en) 2008-02-05 2009-02-05 System security manager

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US2643808P 2008-02-05 2008-02-05
US12/366,600 US20090240951A1 (en) 2008-02-05 2009-02-05 System security manager

Publications (1)

Publication Number Publication Date
US20090240951A1 true US20090240951A1 (en) 2009-09-24

Family

ID=40932881

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/366,600 Abandoned US20090240951A1 (en) 2008-02-05 2009-02-05 System security manager
US12/366,619 Active 2030-10-18 US8156321B2 (en) 2008-02-05 2009-02-05 Overlapping state areas for programmable crypto processing circuits
US12/366,602 Active 2030-10-13 US8166289B2 (en) 2008-02-05 2009-02-05 Trusted boot

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/366,619 Active 2030-10-18 US8156321B2 (en) 2008-02-05 2009-02-05 Overlapping state areas for programmable crypto processing circuits
US12/366,602 Active 2030-10-13 US8166289B2 (en) 2008-02-05 2009-02-05 Trusted boot

Country Status (3)

Country Link
US (3) US20090240951A1 (en)
EP (1) EP2255292A4 (en)
WO (1) WO2009100249A2 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9396358B1 (en) * 2010-01-19 2016-07-19 Altera Corporation Integrated circuit with a self-destruction mechanism
US10270709B2 (en) 2015-06-26 2019-04-23 Microsoft Technology Licensing, Llc Allocating acceleration component functionality for supporting services
US9355279B1 (en) 2013-03-29 2016-05-31 Secturion Systems, Inc. Multi-tenancy architecture
US9374344B1 (en) 2013-03-29 2016-06-21 Secturion Systems, Inc. Secure end-to-end communication system
US9317718B1 (en) 2013-03-29 2016-04-19 Secturion Systems, Inc. Security device with programmable systolic-matrix cryptographic module and programmable input/output interface
US9798899B1 (en) 2013-03-29 2017-10-24 Secturion Systems, Inc. Replaceable or removable physical interface input/output module
US9524399B1 (en) * 2013-04-01 2016-12-20 Secturion Systems, Inc. Multi-level independent security architecture
US9658858B2 (en) 2013-10-16 2017-05-23 Xilinx, Inc. Multi-threaded low-level startup for system boot efficiency
US9830456B2 (en) 2013-10-21 2017-11-28 Cisco Technology, Inc. Trust transference from a trusted processor to an untrusted processor
US9727722B2 (en) * 2015-02-23 2017-08-08 Cisco Technology, Inc. Non-intrusive monitoring
US10198294B2 (en) 2015-04-17 2019-02-05 Microsoft Licensing Technology, LLC Handling tenant requests in a system that uses hardware acceleration components
US10511478B2 (en) 2015-04-17 2019-12-17 Microsoft Technology Licensing, Llc Changing between different roles at acceleration components
US9792154B2 (en) 2015-04-17 2017-10-17 Microsoft Technology Licensing, Llc Data processing system having a hardware acceleration plane and a software plane
US10296392B2 (en) 2015-04-17 2019-05-21 Microsoft Technology Licensing, Llc Implementing a multi-component service using plural hardware acceleration components
US10216555B2 (en) 2015-06-26 2019-02-26 Microsoft Technology Licensing, Llc Partially reconfiguring acceleration components
US11283774B2 (en) 2015-09-17 2022-03-22 Secturion Systems, Inc. Cloud storage using encryption gateway with certificate authority identification
US9794064B2 (en) 2015-09-17 2017-10-17 Secturion Systems, Inc. Client(s) to cloud or remote server secure data or file object encryption gateway
US10708236B2 (en) 2015-10-26 2020-07-07 Secturion Systems, Inc. Multi-independent level secure (MILS) storage encryption
CA2944306C (en) 2015-10-30 2023-11-14 The Toronto-Dominion Bank Validating encrypted data from a multi-layer token
US11216808B2 (en) * 2015-11-04 2022-01-04 The Toronto-Dominion Bank Token-based system for excising data from databases
US10552831B2 (en) * 2015-11-05 2020-02-04 The Toronto-Dominion Bank Securing data via multi-layer tokens
US11831786B1 (en) 2018-11-13 2023-11-28 Northrop Grumman Systems Corporation Chain of trust
US11157626B1 (en) 2019-05-29 2021-10-26 Northrop Grumman Systems Corporation Bi-directional chain of trust network

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010032318A1 (en) * 1999-12-03 2001-10-18 Yip Kun Wah Apparatus and method for protecting configuration data in a programmable device
US6356637B1 (en) * 1998-09-18 2002-03-12 Sun Microsystems, Inc. Field programmable gate arrays
US20020150252A1 (en) * 2001-03-27 2002-10-17 Leopard Logic, Inc. Secure intellectual property for a generated field programmable gate array
US6681354B2 (en) * 2001-01-31 2004-01-20 Stmicroelectronics, Inc. Embedded field programmable gate array for performing built-in self test functions in a system on a chip and method of operation
US20050018371A1 (en) * 2003-06-13 2005-01-27 Mladenik John E. Systems and methods for fault-based power signal interruption
US20050144524A1 (en) * 2003-12-04 2005-06-30 International Business Machines Corporation Digital reliability monitor having autonomic repair and notification capability
US20050240591A1 (en) * 2004-04-21 2005-10-27 Carla Marceau Secure peer-to-peer object storage system
US20060059369A1 (en) * 2004-09-10 2006-03-16 International Business Machines Corporation Circuit chip for cryptographic processing having a secure interface to an external memory
US20060265603A1 (en) * 2005-03-24 2006-11-23 Sony United Kingdom Limited Programmable logic device
US20070220369A1 (en) * 2006-02-21 2007-09-20 International Business Machines Corporation Fault isolation and availability mechanism for multi-processor system
US7318145B1 (en) * 2001-06-01 2008-01-08 Mips Technologies, Inc. Random slip generator
US7366306B1 (en) * 2002-03-29 2008-04-29 Xilinx, Inc. Programmable logic device that supports secure and non-secure modes of decryption-key access
US20080317002A1 (en) * 2007-06-19 2008-12-25 Boppana Rajendra V Tamper-resistant communication layer for attack mitigation and reliable intrusion detection
US20090119503A1 (en) * 2007-11-06 2009-05-07 L3 Communications Corporation Secure programmable hardware component
US20090147945A1 (en) * 2007-12-05 2009-06-11 Itt Manufacturing Enterprises, Inc. Configurable ASIC-embedded cryptographic processing engine
US7689726B1 (en) * 2004-10-01 2010-03-30 Xilinx, Inc. Bootable integrated circuit device for readback encoding of configuration data
US7844886B1 (en) * 2006-05-16 2010-11-30 Altera Corporation Parallel processing error detection and location circuitry for configuration random-access memory

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6378072B1 (en) * 1998-02-03 2002-04-23 Compaq Computer Corporation Cryptographic system
US20020061107A1 (en) * 2000-09-25 2002-05-23 Tham Terry K. Methods and apparatus for implementing a cryptography engine
US6996713B1 (en) * 2002-03-29 2006-02-07 Xilinx, Inc. Method and apparatus for protecting proprietary decryption keys for programmable logic devices
US7197647B1 (en) * 2002-09-30 2007-03-27 Carnegie Mellon University Method of securing programmable logic configuration data
US7366302B2 (en) * 2003-08-25 2008-04-29 Sony Corporation Apparatus and method for an iterative cryptographic block
US7376968B2 (en) * 2003-11-20 2008-05-20 Microsoft Corporation BIOS integrated encryption
US20060026417A1 (en) * 2004-07-30 2006-02-02 Information Assurance Systems L.L.C. High-assurance secure boot content protection
US8036379B2 (en) * 2006-03-15 2011-10-11 Microsoft Corporation Cryptographic processing
US7904711B2 (en) * 2007-10-24 2011-03-08 Harris Corporation Scaleable architecture to support high assurance internet protocol encryption (HAIPE)
US8095800B2 (en) * 2008-11-20 2012-01-10 General Dynamics C4 System, Inc. Secure configuration of programmable logic device


Also Published As

Publication number Publication date
US8156321B2 (en) 2012-04-10
WO2009100249A3 (en) 2009-11-26
US8166289B2 (en) 2012-04-24
WO2009100249A2 (en) 2009-08-13
EP2255292A4 (en) 2014-09-24
EP2255292A2 (en) 2010-12-01
US20090235064A1 (en) 2009-09-17
US20090198991A1 (en) 2009-08-06

Similar Documents

Publication Publication Date Title
US8156321B2 (en) Overlapping state areas for programmable crypto processing circuits
US10037438B2 (en) Setting security features of programmable logic devices
US7979826B1 (en) Computer-readable storage media comprising data streams having mixed mode data correction capability
US8209545B1 (en) FPGA configuration bitstream protection using multiple keys
US9111121B2 (en) Method and apparatus for securing a programmable device using a kill switch
US8892837B2 (en) Integrated circuit with tamper-detection and self-erase mechanisms
US7834652B1 (en) Method and devices for storing a security key using programmable fuses
US20080072070A1 (en) Secure virtual RAM
US20070237325A1 (en) Method and apparatus to improve security of cryptographic systems
US11436382B2 (en) Systems and methods for detecting and mitigating programmable logic device tampering
US20100125739A1 (en) Secure configuration of programmable logic device
JP5246863B2 (en) Logic program data protection system and protection method for reconfigurable logic device
US20090119503A1 (en) Secure programmable hardware component
US9641330B2 (en) Trusted tamper reactive secure storage
US10720927B1 (en) Selectively disabled output
US7607025B1 (en) Methods of intrusion detection and prevention in secure programmable logic devices
US7752407B1 (en) Security RAM block
CN104615953A (en) Programmable logic device enabling configuration data flows to be high in safety
Druyer et al. A survey on security features in modern FPGAs
US8983073B1 (en) Method and apparatus for restricting the use of integrated circuits
Peterson Developing tamper resistant designs with Xilinx Virtex-6 and 7 series FPGAs
US20150323919A1 (en) Method for operating a control unit
US20150324610A1 (en) Method for managing software functionalities in a control unit
Seely et al. Enabling military design security with high-performance FPGAs
Chen et al. In-place Logic Obfuscation for Emerging Nonvolatile FPGAs

Legal Events

Date Code Title Description
AS Assignment

Owner name: VIASAT, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OWENS, JOHN R;ANDOLINA, JOHN C.;SHANKEN, STUART;AND OTHERS;REEL/FRAME:022765/0717;SIGNING DATES FROM 20090426 TO 20090430

AS Assignment

Owner name: UNION BANK, N.A., CALIFORNIA

Free format text: SECURITY AGREEMENT;ASSIGNOR:VIASAT, INC.;REEL/FRAME:028184/0152

Effective date: 20120509

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION