
Biometric identification systems are closely related to information security, data protection, and encryption. This paper proposes a method that integrates biometric data encryption and authentication with error-correction techniques. The usual approach of biometric template matching is replaced by a more powerful, higher-quality identification approach based on Gröbner basis computations. In conventional biometric systems, where the data are inherently noisy, only approximate matching can be expected; our cryptographic method, by contrast, provides exact matching.

Digital data sent over communication channels are subject to distortion caused by various circumstances, such as electromagnetic fluctuations. Likewise, it may not be possible to recover correct data from a hard disk or a digital audio (or video) system, since most storage media are prone to errors. Another extreme example is images transmitted from space probes, where a considerable error rate occurs and re-transmission is often not possible. A final example is biometric feature vectors, which are built from the attributes of individuals and are noisy by nature. The consequence is that the digital data received (read or captured) may not be identical to the data originally sent (stored or enrolled). In this regard, error-correcting codes provide a systematic technique for transmitting messages with some supplementary information (check digits), in such a way that an error occurring in the original message will not only be detected by the receiver but, in many cases, can also be corrected. Additional algebraic structure must be imposed on the code spaces to solve the encoding and decoding problems efficiently. In particular, linear codes are widely used for controlling errors because they are well understood, powerful, and easy to generate.
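As a concrete illustration of the last point, the following sketch shows single-error detection and correction with a binary linear code. The [7,4] Hamming code used here is a standard textbook example chosen for illustration, not a construction from this paper:

```python
# Sketch: single-error correction with the [7,4] Hamming code over GF(2).
# Generator matrix G (4x7, systematic form [I | A]) and the matching
# parity-check matrix H (3x7, [A^T | I]); standard textbook matrices.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """Codeword c = msg * G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome s = H * word^T over GF(2); zero iff word is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Flip the single bit whose H-column matches the syndrome."""
    s = syndrome(word)
    if any(s):
        for j in range(7):
            if [H[i][j] for i in range(3)] == s:
                word = word[:]          # work on a copy
                word[j] ^= 1
                break
    return word

msg = [1, 0, 1, 1]
c = encode(msg)                         # [1, 0, 1, 1, 0, 1, 0]
r = c[:]
r[2] ^= 1                               # channel flips one bit
assert syndrome(r) != [0, 0, 0]         # error detected
assert correct(r) == c                  # ... and corrected
```

Because the columns of H are pairwise distinct and nonzero, the syndrome of any single-bit error pinpoints the flipped position, which is exactly the sense in which the check digits allow correction, not just detection.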

Biometrics can be defined as the largely automated measurement of physiological and behavioral characteristics used to prove or confirm the identities of human beings [

Once a biometric trait is captured, residual random noise is removed using filters (as a directed smoothing process); see, for instance, [

A Gröbner basis, a powerful tool originating in commutative algebra, is defined as a set of polynomials computed using Buchberger's algorithm [

In this section, we give the essential background required to understand the syndrome decoding problem and present some principles of linear codes. We also recall the elementary theory of the Gröbner basis algorithm for multivariate polynomials. We show that the basic component of Gröbner basis theory is the concept of polynomial reduction, which is used to compute the appropriately defined normal form of a given polynomial.

Let

Let

is a linear code of length n and dimension k over

A generator matrix for the code

which produces the code

The binary parity-check matrix of such a code

We define the syndrome

We introduce some basic definitions needed to explain Gröbner basis theory. We cite only the theorem on the existence of the Gröbner basis of an ideal. In 1965, Buchberger [

multivariate polynomials in n commuting variables over a computable field, introducing the notion of a Gröbner basis. Furthermore, Mora [

multivariate polynomials in n non-commuting variables over a computable field. Both the Buchberger and Mora algorithms, which are based on a generalization of the Euclidean division algorithm to several variables, use the fact that the coefficients lie in a specified field. Therefore, given any two polynomials f and g, we can write f as

then, using the generalized division algorithm, we can write a polynomial f as

In the case of several variables, however, there are numerous choices of term ordering; see [

Buchberger not only proved that every ideal has a basis for which the ideal membership problem is computationally solvable; he also described an algorithm for obtaining such a basis. We should mention here that polynomial reduction is the cornerstone of Gröbner basis algorithms, as it represents the most computationally intensive part. In this regard, we say that a polynomial f reduces to a different polynomial r, denoted as

we define

where

Of course, in the non-commutative case the situation is more complicated, since the monomials are words and there may be more than one S-polynomial, or none at all. A finite set of polynomials

Now we are in a position to give the layout of the Buchberger algorithm. It starts with the initial basis

then h is appended to the basis. The process, with further technical details, is repeated until we obtain a basis satisfying the condition mentioned above. In the commutative case, Buchberger showed that the process always terminates and yields a Gröbner basis. In the non-commutative case, on the other hand, Mora noted that the process does not always terminate; when it does, however, it produces a Gröbner basis.
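The layout above can be sketched in code. The following is a minimal, from-scratch Buchberger loop for commutative polynomials over GF(2) under the lex ordering, written purely for illustration (the paper's actual computations use MAGMA's built-in routines). A polynomial is represented as a frozenset of monomials and a monomial as a tuple of exponents, so that addition over GF(2) is the symmetric difference of term sets:

```python
# Toy Buchberger algorithm for GF(2)[x1,...,xn], lex order.
# Polynomial = frozenset of monomials; monomial = tuple of exponents.
from itertools import combinations

def lm(f):
    """Leading monomial under lex order (tuple comparison is lex)."""
    return max(f)

def mul(m, f):
    """Multiply polynomial f by the monomial m."""
    return frozenset(tuple(a + b for a, b in zip(m, t)) for t in f)

def divides(m, t):
    return all(a <= b for a, b in zip(m, t))

def reduce_poly(f, G):
    """Fully reduce f modulo the set G; over GF(2), '+' is XOR."""
    changed = True
    while f and changed:
        changed = False
        for t in sorted(f, reverse=True):           # largest term first
            for g in G:
                if divides(lm(g), t):
                    q = tuple(a - b for a, b in zip(t, lm(g)))
                    f = f ^ mul(q, g)               # cancel the term t
                    changed = True
                    break
            if changed:
                break
    return f

def s_poly(f, g):
    """S-polynomial: cancel the lcm of the two leading monomials."""
    L = tuple(max(a, b) for a, b in zip(lm(f), lm(g)))
    qf = tuple(a - b for a, b in zip(L, lm(f)))
    qg = tuple(a - b for a, b in zip(L, lm(g)))
    return mul(qf, f) ^ mul(qg, g)

def buchberger(F):
    """Grow F until every S-polynomial reduces to zero."""
    G = list(F)
    pairs = list(combinations(range(len(G)), 2))
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(s_poly(G[i], G[j]), G)
        if r:                                       # nonzero remainder h
            pairs += [(k, len(G)) for k in range(len(G))]
            G.append(r)                             # append h to the basis
    return G

# Example: the ideal <x^2 + y, xy + x> in GF(2)[x, y], lex with x > y.
f1 = frozenset({(2, 0), (0, 1)})                    # x^2 + y
f2 = frozenset({(1, 1), (1, 0)})                    # xy + x
G = buchberger([f1, f2])
assert frozenset({(0, 2), (0, 1)}) in G             # y^2 + y was added
```

Running the loop on this example adds the single new polynomial y^2 + y, after which every remaining S-polynomial reduces to zero, exactly the termination condition described above.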

For the proofs of the existence and uniqueness of a Gröbner basis (in fact, the reduced Gröbner basis) G for an ideal

The interaction between coding theory and Gröbner bases arises from the fact that every function from

Let

where

In order to solve the syndrome decoding problem, we need to find such a linear combination that gives the syndrome vector.
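The following brute-force sketch makes this concrete for a small toy parity-check matrix (a standard [7,4] Hamming matrix used only for illustration, not taken from the paper): it searches for a minimum-weight set of columns of H that sums, over GF(2), to the given syndrome. Brute force is exponential in general; the point here is only to state the problem in executable form:

```python
# Syndrome decoding by exhaustive search: find a minimum-weight error
# pattern e with H * e^T = s, i.e. a smallest set of columns of H
# summing (over GF(2)) to the syndrome s. Toy-sized example only.
from itertools import combinations

H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]
n = len(H[0])

def column_sum(positions):
    """Sum (over GF(2)) of the H-columns indexed by `positions`."""
    return [sum(row[j] for j in positions) % 2 for row in H]

def decode_syndrome(s):
    """Smallest-weight error pattern whose H-columns sum to s."""
    for w in range(n + 1):                      # try weight 0, 1, 2, ...
        for positions in combinations(range(n), w):
            if column_sum(positions) == list(s):
                e = [0] * n
                for j in positions:
                    e[j] = 1
                return e
    return None

s = [0, 1, 1]                  # syndrome of a single error in position 2
assert decode_syndrome(s) == [0, 0, 1, 0, 0, 0, 0]
```

The algebraic methods discussed in this section replace this exhaustive search with the structured computation of a reduced Gröbner basis.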

Let

Then, we can recover the error vector as

For any two vectors

We can write the star product of two columns

where

The relation between

Now we can express the unknown syndromes in terms of the known syndromes as

Since the rank of

By representing

Let the unknowns

The ideal generated by the combined system defines the set of solutions that satisfy both systems. The reduced Gröbner basis of the combined system with respect to a monomial ordering takes the form

where t is the smallest positive integer such that the system has a solution (see the proof and further details on all of the above in Bulygin and Pellikaan [

In [

A biometric image is acquired, and a feature vector is extracted from an enhanced version of this image. Let

The authentication process is similar to the enrollment process for any newly acquired biometric image. The result of this process is called the match score. Here, we assume that the number of errors is smaller than the error-correcting capacity of the code. Algorithm 2 shows how to verify (match) a feature vector of a newly captured biometric trait.
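For intuition, the enroll/verify flow can be sketched in a fuzzy-commitment style, with a 3-fold repetition code standing in for the paper's Gröbner-based construction. Everything here is an illustrative assumption, not the paper's algorithm: matching succeeds exactly when decoding removes the noise between the enrolled and newly captured feature vectors, under the assumption (as above) that the number of errors stays within the code's correcting capacity:

```python
# Fuzzy-commitment-style sketch of enrollment and verification.
# A 3-fold repetition code stands in for the paper's construction;
# all names and parameters here are illustrative assumptions.
import hashlib
import random

def encode(key_bits):
    """Repetition-3 encoder: each key bit becomes three codeword bits."""
    return [b for b in key_bits for _ in range(3)]

def decode(word):
    """Majority vote per 3-bit block corrects one error per block."""
    return [int(sum(word[i:i + 3]) >= 2) for i in range(0, len(word), 3)]

def enroll(feature):
    """Store only a hash of a random key plus the XOR offset (helper)."""
    key = [random.randint(0, 1) for _ in range(len(feature) // 3)]
    c = encode(key)
    helper = [f ^ b for f, b in zip(feature, c)]
    return hashlib.sha256(bytes(key)).hexdigest(), helper

def verify(feature2, stored_hash, helper):
    """Match iff decoding cancels the noise between the two features."""
    word = [f ^ h for f, h in zip(feature2, helper)]
    key = decode(word)
    return hashlib.sha256(bytes(key)).hexdigest() == stored_hash

enrolled = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0]   # 12-bit toy feature
stored_hash, helper = enroll(enrolled)

noisy = enrolled[:]
noisy[4] ^= 1                  # one flipped bit per 3-bit block is fine
assert verify(noisy, stored_hash, helper)

other = [b ^ 1 for b in enrolled]                 # very different trait
assert not verify(other, stored_hash, helper)
```

Note that the raw feature vector never needs to be stored: only the key hash and the helper data are kept, which is what gives such schemes their data-protection property.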

Our technique differs from conventional biometric authentication techniques, which use a numerical measure of the similarity of two biometric traits acquired at the enrollment and verification steps. These conventional biometric systems require powerful digital signal processing algorithms to enhance the captured images before extracting the hidden characteristics. This process, called feature vector extraction, is indeed the most critical part of biometric identification. Our method is able to overcome most of the problems that might result from extracting biometric information as binary feature vectors from realizations of biometric traits.

The implementation (as a proof-of-concept prototype) of our approach was carried out using various feature vectors of fingerprint and palm vein images as test data. We evaluated the algorithm using samples from 50 different users, with 5 samples per user. The algorithm was implemented using interpreted code as well as several built-in functions of MAGMA [

Although the proposed approach does not yet achieve the anticipated performance in terms of Gröbner basis computation complexity and latency, it does provide a low-cost, secure biometric encryption architecture. Moreover, it reveals various factors and provides useful insights that should motivate researchers in the area of integrating computational algebra with biometrics. The method is also suitable for other biometric systems, such as iris biometrics, which likewise seem very promising. All of these reasons make the algorithm feasible for a range of practical uses.

Even though many commercial and academic systems for biometric identification are in operation, the considerable number of publications in this domain indicates the need for extensive research to obtain better performance and enhance the reliability of such systems. In this paper, the problem of merging biometrics and cryptography was tackled. We used an algebraic method that allows the exact recovery of a given binary word (representing a biometric feature vector) using a randomly chosen word from a proposed code. We showed how to match a feature vector of a biometric trait by exploiting the theory of error-correcting codes over the field of two elements

Mohamed Sayed (2015) Gröbner Bases Method for Biometric Traits Identification and Encryption. Journal of Information Security, 6, 241-249. doi: 10.4236/jis.2015.63024