Two Factor Authentication - Engineering Seminar


Two Factor Authentication
INTRODUCTION
Authentication:
Authentication is the act of establishing or confirming something (or someone) as authentic, that is, that claims made by or about the thing are true. Authenticating an object may mean confirming its provenance, whereas authenticating a person often consists of verifying their identity. Authentication depends upon one or more authentication factors.
In computer security, authentication is the process of attempting to verify the digital identity of the sender of a communication such as a request to log in. The sender being authenticated may be a person using a computer, a computer itself or a computer program. A blind credential, in contrast, does not establish identity at all, but only a narrow right or status of the user or program.
In a web of trust, authentication is a way to ensure that users are who they say they are: that the user who attempts to perform functions in a system is in fact the user who is authorized to do so.

Difference between Authentication and Authorization:
Authorization is often thought to be identical to authentication; many widely adopted standard protocols, obligatory regulations, and even statutes are based on this assumption.
However, more precise usage describes authentication as the process of verifying a person's identity, while authorization is the process of verifying that a known person has the authority to perform a certain operation. Authentication, therefore, must precede authorization. 
For example, when you show proper identification to a bank teller, you could be authenticated by the teller, and you would be authorized to access information about your bank accounts. You would not be authorized to access accounts that are not your own.

Authentication Factors:
The authentication factors for humans are generally classified into four cases:
Something the user is (e.g., fingerprint or retinal pattern, DNA  sequence (there are assorted definitions of what is sufficient), voice pattern (again several definitions), signature recognition, unique bio-electric signals produced by the living body, or other biometric identifier)
Something the user has (e.g., ID card, security token, software token or cell phone)
Something the user knows (e.g., a password, a pass phrase or a personal identification number (PIN)).
Something the user does (e.g., voice recognition, signature, or gait).

e-Authentication:
It is defined as a Web-based service that provides authentication to end users accessing (logging into) an Internet service.
e-Authentication is similar to credit card verification for eCommerce web sites. The verification is done by a dedicated service that receives the input and returns a success or failure indication.
For example, an end user wishes to enter his e-Buy or e-Trade web site. He gets the login web page and is required to enter his user ID and a password or, on more secure sites, his One Time Password.
The information is transmitted to the e-Authentication service as a query. If the service returns success, the end user is permitted into the e-Trade service with his privileges as a user.
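As a rough illustration of this flow, the sketch below (in Java, the application language listed later in this document) shows a login handler delegating verification to an e-Authentication service. EAuthService, LoginHandler and every other name here are hypothetical, not part of any particular product:

// Hypothetical e-authentication service: receives the input and
// returns a success or failure indication.
interface EAuthService {
    boolean verify(String userId, String credential);
}

public class LoginHandler {
    private final EAuthService authService;

    public LoginHandler(EAuthService authService) {
        this.authService = authService;
    }

    // Transmits the user's credentials to the e-authentication service
    // as a query and acts on the returned indication.
    public String login(String userId, String passwordOrOtp) {
        if (authService.verify(userId, passwordOrOtp)) {
            return "ACCESS GRANTED"; // user enters with their privileges
        }
        return "ACCESS DENIED";
    }
}

The handler never decides validity itself; it only acts on the success or failure indication returned by the dedicated service, mirroring the query-and-response flow described above.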

Purpose of Authentication:
On August 8, 2001, the FFIEC agencies (the agencies) issued guidance entitled Authentication in an Electronic Banking Environment (2001 Guidance). The 2001 Guidance focused on risk management controls necessary to authenticate the identity of retail and commercial customers accessing Internet-based financial services.
Since 2001, there have been significant legal and technological changes with respect to the protection of customer information, increased incidents of fraud, including identity theft, and the introduction of improved authentication technologies.
  This updated guidance replaces the 2001 Guidance and specifically addresses why financial institutions regulated by the agencies should conduct risk-based assessments, evaluate customer awareness programs, and develop security measures to reliably authenticate customers remotely accessing their Internet-based financial services.
  Financial institutions should use this guidance when evaluating and implementing authentication systems and practices whether they are provided internally or by a service provider. Although this guidance is focused on the risks and risk management techniques associated with the Internet delivery channel, the principles are applicable to all forms of electronic banking activities.
The agencies consider single-factor authentication, as the only control mechanism, to be inadequate for high-risk transactions involving access to customer information or the movement of funds to other parties. 

The authentication techniques employed by the financial institution should be appropriate to the risks associated with those products and services. Account fraud and identity theft are frequently the result of single-factor (e.g., ID/password) authentication exploitation. 

           Where risk assessments indicate that the use of single-factor authentication is inadequate, financial institutions should implement multifactor authentication, layered security, or other controls reasonably calculated to mitigate those risks. 

Consistent with FFIEC Information Technology guidance of December 2002, financial institutions should periodically:
Ensure that their information security program:
identifies and assesses the risks associated with Internet-based products and services;
identifies risk mitigation actions, including appropriate authentication strength;
measures and evaluates customer awareness efforts.
Adjust, as appropriate, their information security program in light of any relevant changes in technology, the sensitivity of customer information, and internal or external threats to information.
Implement appropriate risk mitigation strategies.

Need for Strong Authentication:
Single-factor authentication usually consists of "something you know". However, such methods are generally susceptible to attacks that can compromise the security of the application. Some of the more common attacks can occur at little or no cost to the perpetrator and without detection.
Attack programs are readily available over the Internet. If undetected, the perpetrator could access the information without alerting the legitimate user. This is the reason for using a strong user authentication process to protect data and systems. Strong user authentication has many benefits.
           First, effective authentication provides the basis for validation of parties to the transaction and their agreement to its terms.
Second, it is a necessary element to establish authenticity of the records evidencing the electronic transaction should there ever be a dispute.
Third, it is a necessary element to establish the integrity of the records evidencing the electronic transaction. All of these elements promote the enforceability of electronic agreements.
 Financial institutions should assess the adequacy of existing authentication techniques in the light of changing or new perceived risks. According to the ICSA (International Computer Security Association), 80 per cent of system undermining occurs from within the organization. The Basle Committee on Banking Supervision advises financial institutions to consider the apparent risks of offering internet banking services based on PIN alone. Single factor authentication alone may not be commercially reasonable or adequate for high-risk applications and transactions.
Systems linked to open and untrusted networks like the Internet are subject to a greater number of individuals who may attempt to compromise the system. Attackers may use automated programs to systematically generate millions of numerical combinations, in the case of systems relying on a PIN alone, to learn a customer's access code (a brute force attack).

Background of Authentication:
Financial institutions engaging in any form of Internet banking should have effective and reliable methods to authenticate customers. An effective authentication system is necessary for compliance with requirements to safeguard customer information, to prevent money laundering and terrorist financing, to reduce fraud, to inhibit identity theft, and to promote the legal enforceability of their electronic agreements and transactions. The risks of doing business with unauthorized or incorrectly identified persons in an Internet banking environment can result in financial loss and reputation damage through fraud, disclosure of customer information, corruption of data, or unenforceable agreements. 
There are a variety of technologies and methodologies financial institutions can use to authenticate customers. These methods include the use of customer passwords, personal identification numbers (PINs), digital certificates using a public key infrastructure (PKI), physical devices such as smart cards, one-time passwords (OTPs), USB plug-ins or other types of “tokens”, transaction profile scripts, biometric identification, and others. The level of risk protection afforded by each of these techniques varies. 
The selection and use of authentication technologies and methods should depend upon the results of the financial institution’s risk assessment process. Authentication methods that depend on more than one factor are more difficult to compromise than single-factor methods. Accordingly, properly designed and implemented multifactor authentication methods are more reliable and stronger fraud deterrents. 
For example, the use of a logon ID/password is single-factor authentication (i.e., something the user knows); whereas, an ATM transaction requires multifactor authentication: something the user possesses (i.e., the card) combined with something the user knows (i.e., PIN). 
The success of a particular authentication method depends on more than the technology. It also depends on appropriate policies, procedures, and controls. An effective authentication method should have customer acceptance, reliable performance, scalability to accommodate growth, and interoperability with existing systems and future plans.

Risk Assessment:
The implementation of appropriate authentication methodologies should start with an assessment of the risk posed by the institution’s Internet banking systems. 
  The risk should be evaluated in light of the type of customer (e.g., retail or commercial), the customer transactional capabilities (e.g., bill payment, wire transfer, loan origination), the sensitivity of customer information being communicated to both the institution and the customer, the ease of using the communication method, and the volume of transactions. Prior agency guidance has elaborated on this risk-based and "layered" approach to information security.
An effective authentication program should be implemented to ensure that controls and authentication tools are appropriate for all of the financial institution’s Internet-based products and services. 
Authentication processes should be designed to maximize interoperability and should be consistent with the financial institution’s overall strategy for Internet banking and electronic commerce customer services. 
The level of authentication used by a financial institution in a particular application should be appropriate to the level of risk in that application.
The method of authentication used in a specific Internet application should be appropriate and reasonable, from a business perspective, in light of the reasonably foreseeable risks in that application.
Because the standards for implementing a commercially reasonable system may change over time as technology and other procedures develop, financial institutions and technology service providers should develop an ongoing process to review authentication technology and ensure appropriate changes are implemented.
The agencies consider single-factor authentication, as the only control mechanism, to be inadequate for high-risk transactions involving access to customer information or the movement of funds to other parties. Single-factor authentication tools, including passwords and PINs, have been widely used for a variety of Internet banking and electronic commerce activities, including account inquiry, bill payment, and account aggregation. 
However, financial institutions should assess the adequacy of such authentication techniques in light of new or changing risks such as phishing, pharming, malware, and the evolving sophistication of compromise techniques. Where risk assessments indicate that the use of single-factor authentication is inadequate, financial institutions should implement multifactor authentication, layered security, or other controls reasonably calculated to mitigate those risks.
The risk assessment process should: 
Identify all transactions and levels of access associated with Internet-based customer products and services.
Identify and assess the risk mitigation techniques, including authentication methodologies, employed for each transaction type and level of access.
Include the ability to gauge the effectiveness of risk mitigation techniques for current and changing risk factors for each transaction type and level of access.

Customer Verification:
With the growth in electronic banking and commerce, financial institutions should use reliable methods of originating new customer accounts online. Moreover, customer identity verification during account origination is required and is important in reducing the risk of identity theft, fraudulent account applications, and unenforceable account agreements or transactions. 

Potentially significant risks arise when a financial institution accepts new customers through the Internet or other electronic channels because of the absence of the physical cues that financial institutions traditionally use to identify persons.         

One method to verify a customer’s identity is a physical presentation of a proof of identity credential such as a driver's license. Similarly, to establish the validity of a business and the authority of persons to perform transactions on its behalf, financial institutions typically review articles of incorporation, business credit reports, and board resolutions identifying officers and authorized signers, and other business credentials. 

However, in an Internet banking environment, reliance on these traditional forms of paper-based verification decreases substantially. Accordingly, financial institutions need to use reliable alternative methods of authentication; traditional paper-based verification as the only control mechanism is inadequate in this case.

PROBLEM STATEMENT:
Existing System:         
Authentication methodologies are numerous and range from simple to complex. The level of security provided varies based upon both the technique used and the manner in which it is deployed. Single-factor authentication involves the use of one factor to verify customer identity. 

The most common single-factor method is the use of a password. Two-factor authentication is most widely used with ATMs. To withdraw money from an ATM, the customer must present both an ATM card and a password or PIN. Multifactor authentication utilizes two or more factors to verify customer identity. 

Authentication methodologies based upon multiple factors can be more difficult to compromise and should be considered for high-risk situations. The effectiveness of a particular authentication technique is dependent upon the integrity of the selected product or process and the manner in which it is implemented and managed.
Whichever authentication tool is chosen depends heavily on the type of service and the channel across which it is delivered, together with a risk assessment that the financial institution must carry out in order to ensure that the perceived risks are adequately mitigated. An effective authentication program should be implemented on an enterprise-wide basis and across all service channels, for example Internet, telephone and call-centre services, to ensure that controls and authentication tools are adequate. Authentication processes should be designed to maximize interoperability and should be consistent with the financial institution's overall strategy for electronic banking and e-commerce customer services.

Tokens:
Tokens are physical devices (something the person has) and may be part of a multifactor authentication scheme. Three types of tokens are discussed here: the USB token device, the smart card, and the password-generating token. 

a) USB Token Device:
The USB token device is typically the size of a house key. It plugs directly into a computer’s USB port and therefore does not require the installation of any special hardware on the user’s computer. Once the USB token is recognized, the customer is prompted to enter his or her password (the second authenticating factor) in order to gain access to the computer system. 
USB tokens are one-piece, injection-molded devices. USB tokens are hard to duplicate and are tamper resistant; thus, they are a relatively secure vehicle for storing sensitive data and credentials. The device has the ability to store digital certificates that can be used in a public key infrastructure (PKI) environment.
The USB token is generally considered to be user-friendly. Its small size makes it easy for the user to carry and, as noted above, it plugs into an existing USB port; thus the need for additional hardware is eliminated.
By requiring two independent elements for user authentication, this approach significantly decreases the chances of unauthorized information access and fraud.
USB Tokens are designed to securely store an individual’s digital identity (digital ID), specifically their Entrust digital certificates and keys.

These portable tokens plug into a computer's USB port, either directly or using a USB extension cable. When users attempt to log in to applications via the desktop, VPN/WLAN or Web portal, they are prompted to enter their unique PIN. If the entered PIN matches the PIN within the Entrust USB Token, the appropriate digital credentials are passed to the network and access is granted. PINs stored on the token are encrypted for added security.

b) Smart Card:
A smart card is a small, tamperproof computer. The smart card itself contains a CPU and some non-volatile storage. In most cards, some of the storage is tamperproof while the rest is accessible to any application that can talk to the card. This capability makes it possible for the card to keep some secrets, such as the private keys associated with any certificates it holds. The card itself actually performs its own cryptographic operations.
Although smart cards are often compared to hard drives, they are “secured drives with a brain”—they store and process information. 
Smart cards are storage devices with the core mechanics to facilitate communication with a reader or coupler, as shown in fig. 1.2. They have file-system configurations and the ability to be partitioned into public and private spaces that can be made available or locked. They also have segregated areas for protected information, such as certificates, e-purses, and entire operating systems. In addition to traditional data storage states, such as read-only and read/write, some vendors are working with sub-states best described as "add only" and "update only."
Smart cards are a key component of the public key infrastructure (PKI) because smart cards enhance software-only solutions, such as client authentication, logon, and secure email. Smart cards are a point of convergence for public key certificates and associated keys because they:
1) Provide tamper-resistant storage for protecting private keys and other forms of personal information.
2) Isolate security-critical computations, involving authentication, digital signatures, and key exchange from other parts of the system that don’t have a need to know.
3) Enable portability of credentials and other private information between computers at work, at home, or on the road.

The primary disadvantage of smart cards as a consumer authentication device is that they require the installation of a hardware reader and associated software drivers on the consumer's home computer.

Biometrics:
Biometrics are automated methods of identifying a person or verifying the identity of a person based on a physiological or behavioral characteristic. Examples of physiological characteristics include hand or finger images, facial characteristics, and iris recognition. Behavioral characteristics are traits that are learned or acquired. Dynamic signature verification, speaker verification, and keystroke dynamics are examples of behavioral characteristics.

Biometric authentication requires comparing a registered or enrolled biometric sample (biometric template or identifier) against a newly captured biometric sample (for example, a fingerprint captured during a login). During Enrollment, as shown in the picture below, a sample of the biometric trait is captured, processed by a computer, and stored for later comparison.

Biometric recognition can be used in Identification mode, where the biometric system identifies a person from the entire enrolled population by searching a database for a match based solely on the biometric. 

For example, an entire database can be searched to verify a person has not applied for entitlement benefits under two different names. This is sometimes called “one-to-many” matching. A system can also be used in Verification mode, where the biometric system authenticates a person’s claimed identity from their previously enrolled pattern. This is also called “one-to-one” matching. 

In most computer access or network access environments, verification mode would be used. A user enters an account, user name, or inserts a token such as a smart card, but instead of entering a password, a simple touch with a finger or a glance at a camera is enough to authenticate the user.
Biometric techniques:

Various biometric techniques and identifiers are being developed and tested; these include:
Fingerprint recognition.
Iris scan. 

a) Fingerprint Recognition:
Fingerprint recognition technologies analyze global pattern schemata on the fingerprint, along with small unique marks known as minutiae, which are the ridge endings and bifurcations or branches in the fingerprint ridges. The data extracted from fingerprints are extremely dense and the density explains why fingerprints are a very reliable means of identification. 

Fingerprint recognition systems store only data describing the exact fingerprint minutiae; images of actual fingerprints are not retained. Fingerprint scanners may be built into computer keyboards or pointing devices (mice), or may be stand-alone scanning devices attached to a computer. 

Fingerprints are unique and complex enough to provide a robust template for authentication, as shown in fig. 1.3. Using multiple fingerprints from the same individual affords a greater degree of accuracy. Fingerprint identification technologies are among the most mature and accurate of the various biometric methods of identification.
Although end users should have little trouble using a fingerprint-scanning device, special hardware and software must be installed on the user’s computer. Fingerprint recognition implementation will vary according to the vendor and the degree of sophistication required. This technology is not portable since a scanning device needs to be installed on each participating user’s computer. However, fingerprint biometrics is generally considered easier to install and use than other, more complex technologies, such as iris scanning. 

Enrollment can be performed either at the financial institution's customer service center or remotely by the customer after he or she has received setup instructions and passwords. According to fingerprint technology vendors, there are several scenarios for remote enrollment that provide adequate security, but for large-dollar transaction accounts, the institution should consider requiring that customers appear in person.

b) Iris Scan:
Iris recognition is a method of biometric authentication that uses pattern recognition techniques based on high-resolution images of the irides of an individual's eyes. Iris recognition uses camera technology with subtle IR illumination to reduce specular reflection from the convex cornea and create images of the detail-rich, intricate structures of the iris. These unique structures, converted into digital templates, provide mathematical representations of the iris that yield unambiguous positive identification of an individual.
Iris recognition efficacy is rarely impeded by glasses or contact lenses, as shown in fig. 1.4. Iris technology has the smallest outlier group (those who cannot use or enroll) of all biometric technologies. It is the only biometric authentication technology designed for use in a one-to-many search environment, and a key advantage of iris recognition is its stability, or template longevity: barring trauma, a single enrollment can last a lifetime.
The iris of the eye has been described as the ideal part of the human body for biometric identification for several reasons:
It is an internal organ that is well protected against damage and wear by a highly transparent and sensitive membrane (the cornea). This distinguishes it from fingerprints, which can be difficult to recognize after years of certain types of manual labor.
The iris is mostly flat and its geometric configuration is only controlled by two complementary muscles (the sphincter pupillae and dilator pupillae), which control the diameter of the pupil. This makes the iris shape far more predictable than, for instance, that of the face.
The iris has a fine texture that, like fingerprints, is determined randomly during embryonic gestation. Even genetically identical individuals have completely independent iris textures, whereas DNA (genetic "fingerprinting") is not unique for the roughly 1.5% of the human population who have a genetically identical monozygotic twin.

SYSTEM ANALYSIS
1. PROPOSAL AND IMPLEMENTATION OF SECURE AUTHENTICATION
This proposal deals with a TWO FACTOR AUTHENTICATION TECHNIQUE. Many organizations still rely only on a static ID and password, as used in the existing Single Factor Authentication system; this provides the simplest form of authentication and may not be sufficient to safeguard against unauthorized access. The rapid spread of e-Business has made securing transactions a necessity, so financial organizations are looking to the two-factor authentication technique as a fundamental security function.

This SECURE AUTHENTICATION SYSTEM implements a two-factor authentication process based on the generation of a One Time Password. It provides a useful authentication mechanism for situations where there is limited client or server trust. Here both the client and the server share a common secret and a cryptographic algorithm; these are then used to generate an OTP at both ends. If the server validates the two OTPs to be the same, the authentication is successful.
Here the OTP generated at either end at any instant depends upon the previous OTP that was generated, thereby making authentication even stronger.
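A minimal sketch of this chained generation, assuming SHA-1 (the FIPS 180-1 hash named later in this document) applied to the shared secret concatenated with the previous OTP; the class and method names are illustrative, not the project's actual code:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Illustrative sketch: client and server each run this same routine over
// the shared secret and the previously generated OTP, so both sides
// derive the same next OTP independently.
public class ChainedOtpGenerator {

    public static String nextOtp(String sharedSecret, String previousOtp)
            throws NoSuchAlgorithmException {
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        sha1.update(sharedSecret.getBytes(StandardCharsets.UTF_8));
        sha1.update(previousOtp.getBytes(StandardCharsets.UTF_8));
        byte[] digest = sha1.digest();

        // Reduce the 160-bit digest to a 12-digit decimal code, in line
        // with the "more than 11 digits" description later in this document.
        long value = 0;
        for (int i = 0; i < 8; i++) {
            value = (value << 8) | (digest[i] & 0xFF);
        }
        value &= Long.MAX_VALUE; // force non-negative
        return String.format("%012d", value % 1_000_000_000_000L);
    }
}

On first use, previousOtp would be the initial value assigned at registration (the num described in the Authentication Mechanism section below); each side then stores the OTP it produced as the input for the next round, so client and server stay in step.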

2. EXISTING USER AUTHENTICATION TECHNIQUES:

2.1 SINGLE FACTOR AUTHENTICATION - Defined
Single factor authentication has been traditionally established by one of these elements:
Something you have - including keys or token cards
Something you know - including passwords
Something you are - including fingerprints, voiceprints or retinal scans (iris)
In a single-factor authentication solution the user has to prove knowledge or possession of only one element; no additional authentication factor is required.
Passwords are the most basic and the most common method of single-factor authentication.

SINGLE FACTOR AUTHENTICATION - Drawbacks:
Individually, any one of these approaches has its limitations: "something you have" can be stolen, "something you know" can be guessed, shared or lost, and "something you are" can be copied or spoofed.
Attacks on single-factor authentication include keystroke monitoring, social engineering, man-in-the-middle attacks, network monitoring, password cracking, IT staff abuse, server compromise, and so on.

2.2 TWO FACTOR AUTHENTICATION - Defined

Given the limitations of single-factor authentication, the logical alternative is two-factor authentication, in which two of the methods are applied in tandem: a combination of knowledge and possession, i.e. something the entity knows and something the entity possesses. A perfect example is the system employed to authenticate automated teller machine (ATM) users, which blends a magnetic-strip card (what you have) with a multi-digit PIN (what you know).

Any one type of authentication may authorize access, but using two types moves toward the control concept of non-repudiation: not only can you prove your identity and gain access to a resource, but you cannot deny having accessed the resource at a later time. We define "stronger user authentication" as the two-factor method described below.

NEED FOR STRONG AUTHENTICATION
There are three essential reasons why an organization may decide to use strong authentication:
1. The cost associated with loss of unauthorized data is usually the most compelling reason to use strong authentication.  Strong authentication should be used in the case of high-risk data while it may not pay to use strong authentication for low risk data.

2. A corporation could be held liable for an attack by a hacker. The loss of money and public confidence in this scenario would be great. Use of strong authentication greatly minimizes this risk.

3. The authentication tool should be capable of evolving as technology and threats change. Therefore, when investing in a strong authentication tool, it is essential to acquire one that can change as technology advances. Strong two-factor authentication contains many sub-groups.

3. One Time Passwords
Most of the vulnerabilities of fixed passwords (stealing, as identity thieves do; sniffing; guessing; hacking; etc.) would be eliminated if users constantly changed their passwords: not every 90 days, but every time they log in.
Organizations require strong authentication with one-time passwords for employees and business partners who need to access confidential information, providing an important level of protection for their clients.
          It is important to note that each use of the OTP mechanism causes the authentication database entry for a user to be updated.

         Due to their relative ease of use and familiar end-user paradigm, OTP-based solutions are the most widely deployed by enterprises today. In addition to remote access solutions, more and more enterprises have been adopting strong authentication solutions to secure their critical commerce and communication applications including intranets, extranets, and e-commerce Web applications.          


A One Time Password (OTP) is, as the name indicates, a password that can only be used once. The basics are that there is a server and a client. There are then two main options for how the OTP is generated, distributed and validated.

The first option, which is shown in fig. 2.3, is that the server and the client share a common secret and a cryptographic algorithm, and these are then used to generate an OTP at both ends. If the server validates the two OTPs to be the same, the authentication is successful.

The second option, which is shown in fig. 2.4, is that the server generates the OTP and distributes it to the client in a secure manner. The client then submits the OTP back to the server, and if the server validates the two OTPs to be the same, the authentication is successful.

In Option 1 we require software at both ends, the client side as well as the server side, but there is a possibility that the software at the client side could be stolen and used by others.
To overcome this problem we go for Option 2, where the software is only at the server side. The limitation here is that the client (end user) uses a mobile phone or a registered PDA for sending or receiving SMS messages carrying the One Time Password, thus requiring an additional third party, i.e. a service provider that handles the SMS.
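A rough server-side sketch of Option 2 under these assumptions: the SmsGateway interface is a hypothetical stand-in for the third-party service provider, and the generated OTP is held against the user until the client submits it back:

import java.security.SecureRandom;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical third-party SMS service provider.
interface SmsGateway {
    void send(String phoneNumber, String message);
}

// Illustrative Option 2 server: generates the OTP, distributes it to the
// client's registered phone, and validates the value submitted back.
public class SmsOtpServer {
    private final SecureRandom random = new SecureRandom();
    private final Map<String, String> pendingOtps = new ConcurrentHashMap<>();
    private final SmsGateway gateway;

    public SmsOtpServer(SmsGateway gateway) {
        this.gateway = gateway;
    }

    public void issueOtp(String userId, String phoneNumber) {
        String otp = String.format("%06d", random.nextInt(1_000_000));
        pendingOtps.put(userId, otp);
        gateway.send(phoneNumber, "Your one-time password is " + otp);
    }

    // The stored OTP is consumed whether or not the attempt succeeds,
    // so a given password can never be accepted twice.
    public boolean validate(String userId, String submittedOtp) {
        String expected = pendingOtps.remove(userId);
        return expected != null && expected.equals(submittedOtp);
    }
}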

4. Proposed Two Factor Technique (Modification of Option 1)

Here both the client and the server share a common secret and a cryptographic algorithm; these are then used to generate an OTP at both ends. If the server validates the two OTPs to be the same, the authentication is successful.
Here the OTP generated at either end at any instant depends upon the previous OTP that was generated, thereby making authentication even stronger and helping prevent attacks.

Objectives:
      The objectives of this standard are to: 
a. Improve the administration of password systems that are used for authenticating the identity of individuals accessing computer resources or files. 
b. Provide a standard automated method for producing passwords for a user depending upon the previous password generated.
c. Produce passwords that are easily stored, and entered into computer systems, yet not readily susceptible to automated techniques that have been developed to search for and disclose passwords. 

5. Authentication Mechanism  
· The client and the server are connected through the Web.
· The interaction between the two can be modeled as a sequence of two sessions with a prior phase of initialization.

Initialization:
In this phase the client registers with the server and is assigned a profile including a set of credentials (userid/password) along with an additional credential, a number (num), used as the basis for the two-factor authentication mechanism. This num is automatically generated the first time, assigned to the user's profile, and stored in a database system hosted by the server. It is also securely stored at the client side. This num initially forms the input to the cryptographic algorithm that is used to generate the One Time Password; thereafter, the previous password stored in the database is given as the input to the algorithm.

Session-1:
The client establishes a channel with the bank through the Internet by presenting the login credentials (userid/password). The actual two-factor authentication is implemented by the generation of a One Time Password at the client side, whose encrypted form is sent to the server; simultaneously, the server also generates the One Time Password.

Session-2:
a. At the server, the client's One Time Password is decrypted and the two passwords, the client's and the server's, are compared; if they are the same, authentication is successful and the client is allowed to access data or perform transactions.

b. The advantage of this authentication mechanism is that even if the software at the client side is obtained, others cannot use it, as the One Time Password at any instant depends upon the previous password that is securely stored on the disk.

c. Even if the OTP is stolen in transit, it does the attacker no good, as it keeps changing and an OTP response is never valid twice; the number is more than 11 digits, so it is difficult to recover by brute force.
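A sketch of the Session-2 check under the assumptions above: the server decrypts the client's OTP (RSA is one option; see the Features list below) and compares it with the value it generated independently, using a constant-time comparison so the check itself leaks nothing. All names are illustrative:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.PrivateKey;
import java.util.Base64;
import javax.crypto.Cipher;

// Illustrative Session-2 verification at the server: the client sends its
// OTP encrypted under the server's RSA public key; the server decrypts it
// and compares it with the OTP it generated on its own side.
public class OtpVerifier {
    private final PrivateKey serverPrivateKey;

    public OtpVerifier(PrivateKey serverPrivateKey) {
        this.serverPrivateKey = serverPrivateKey;
    }

    public boolean verify(String encryptedClientOtpBase64, String serverOtp)
            throws Exception {
        Cipher rsa = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        rsa.init(Cipher.DECRYPT_MODE, serverPrivateKey);
        byte[] clientOtp = rsa.doFinal(
                Base64.getDecoder().decode(encryptedClientOtpBase64));

        // Constant-time comparison of the client's and server's OTPs.
        return MessageDigest.isEqual(
                clientOtp, serverOtp.getBytes(StandardCharsets.UTF_8));
    }
}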

Features:

The algorithm at the client side generates the One Time Password and encrypts it.
The algorithm at the server side generates its own One Time Password, decrypts the client's One Time Password, and compares the two.
The standards used can be similar to RSA (for encryption and decryption) and MD5 (for hashing).
Input to the algorithm comes from a secured storage device.

 SOFTWARE REQUIREMENTS:
Application Language   : JAVA
Operating System         : Windows
Protocols                     :  HTTP
Web Server                  : Tomcat

HARDWARE REQUIREMENTS:
Personal computer
Processor  :  Intel Pentium 4 Processor
Hard disk  :  80 GB
RAM       :  512 MB

SYSTEM DESIGN AND IMPLEMENTATION

Server Level

The current model, which uses username and password pairs, is a single-factor authentication mechanism. The main issue with passwords is that, if a strong password model is applied, it becomes difficult to remember multiple passwords for different servers. Strong passwords are defined in [1], under Appendix A, as the "University minimum password standard". System administrators and other staff that need to access servers (Oracle DBAs, for example) frequently have to deal with a high number of passwords for different servers.

There is a requirement for a stronger means of authentication, at least initially for the most sensitive servers/machines. Strong authentication is based upon the requirement for the user to present credentials based on multiple factors (two or more). For the University of Auckland, it was decided that two-factor authentication provided sufficient control.
These factors include: 
Something only the user knows (his password or PIN). 
Something only the user has. This is usually a physical device; an OTP token was used during the evaluation project.

The principle behind this system of authentication is simple. The token generates a one-time password (OTP), which is a 6-digit number. This number is cryptographically generated and is dependent on the initial seed and the current time. The protected server has an authentication agent installed, which in turn passes the authentication request to the (secure) authentication server, as shown in fig. 4.3.

The user connects to the server, which is protected with the two-factor authentication product. This can be a simple service which authenticates the user through the web interface, or a VPN device which is supported by the two-factor authentication product (Chapter 2 enumerates all the requirements for the products which were evaluated). The user is prompted to enter the OTP displayed by the token into the application.

The server passes the authentication request to the secure authentication server. This request is also encrypted and communication is through well known ports/services which enables high level of security for the secure authentication server. Our plan is to put it behind the firewall and to disable everything except the traffic which is needed. 
The secure authentication server verifies the authentication request. It requires the username and the OTP, both of which are passed from the requesting server. In the database, the username has an assigned token; every token has a unique identification number, which enables the server to know what seed it needs to use to compute the OTP. Once the OTP is computed, it can be compared with the one submitted, and authentication can be approved or rejected (if the OTP is incorrect).
The server successfully authenticates the user. 

There are a couple of things to note about the operation of this system and OTP tokens in general:

The OTP is dependent on correct time synchronization with the server. As most tokens are "dummy" devices which can't be configured (an exception is the CRYPTOCard, which has an initializer device), the server has to accommodate slight time errors, because these devices can become desynchronized over time. The server will accept an entered OTP if the time difference is within some predefined interval. As the server knows all the OTPs that can be entered at a certain time (it knows the seed of the token, so it can compute OTPs indefinitely), during the authentication process (step 3) it will compute the previous and the next OTP as well. If the computation shows that the token has a time delay of a certain amount (0.05 seconds), the server will accept the OTP and register the delay in the database. The next time this token is used to authenticate, the server will know what the expected delay is. Of course, if the OTP is outside the preconfigured interval, the server will reject the authentication request and log it as an unsuccessful attempt.
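The drift-tolerant check described above might look like the following sketch, in which the server recomputes the OTP for the previous, current and next time steps and accepts a match in any of them, recording the offset for future logins. The OtpFunction helper and the 60-second step are assumptions for illustration, not the evaluated product's actual parameters:

// Illustrative drift-tolerant validation: the server tries the previous,
// current and next time steps; a match outside this window is rejected.
public class TimeWindowValidator {
    private static final long STEP_SECONDS = 60; // assumed time step

    // Assumed helper that derives the OTP for a given seed and time step.
    interface OtpFunction {
        String computeOtp(byte[] seed, long timeStep);
    }

    private final OtpFunction otpFunction;

    public TimeWindowValidator(OtpFunction otpFunction) {
        this.otpFunction = otpFunction;
    }

    // Returns the detected step offset (-1, 0 or +1) on success, so it can
    // be registered in the database, or null if the OTP is rejected.
    public Integer validate(byte[] seed, String submittedOtp) {
        long now = System.currentTimeMillis() / 1000 / STEP_SECONDS;
        for (int offset = -1; offset <= 1; offset++) {
            if (otpFunction.computeOtp(seed, now + offset)
                    .equals(submittedOtp)) {
                return offset; // register the observed delay
            }
        }
        return null; // log as an unsuccessful attempt
    }
}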

In order to increase the security level of this mechanism, e.g. when a token is lost or stolen, all evaluated products can implement an additional PIN request. This means that, besides the OTP that the token generates, the user has to know a PIN as well. The PIN is a 4-6 digit number which has to be entered before the OTP; the two are concatenated.
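Splitting the concatenated entry back apart on the server is straightforward; this sketch assumes the OTP is always the last six digits, with the variable-length 4-6 digit PIN as the prefix:

// Illustrative split of a concatenated "PIN + OTP" entry, assuming a
// fixed 6-digit OTP suffix.
public class PinOtpSplitter {
    public static String[] split(String entry) {
        if (entry == null || entry.length() < 10) { // 4-digit PIN minimum
            throw new IllegalArgumentException("entry too short");
        }
        String pin = entry.substring(0, entry.length() - 6);
        String otp = entry.substring(entry.length() - 6);
        return new String[] { pin, otp };
    }
}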

          If even more security is needed, all evaluated systems permitted additional authentication by using the normal system password; i.e. after successful authentication with the OTP, the user also has to enter their standard password. 

The native PAM module integration is the preferred method, as authentication through the RADIUS module introduces additional dependencies to the system. The RADIUS module wasn't tested during the evaluation period as it substantially complicates the setup.

Server with FIPS 180-1 standard OTP:
The evaluated OTP server generates passwords conforming to the FIPS 180-1 standard and comes from one of the oldest vendors providing tokens and other hardware for multi-factor authentication.

The tested environment consisted of one FIPS 180-1 standard OTP generation server (authentication server). The server was installed on a machine running the Microsoft Windows 2003 operating system. The installation and initial deployment process supports various network topologies.

The following features were found:
Supports a large number of local and remote clients, which can be integrated so that they can use a strong means of authentication.
Supports software tokens.

Software tokens can be installed on any machine and are protected with the user's password. The security level of software tokens is lower than that of hardware ones, as various attacks on the user are now possible, such as installation of key-logging software which can record the user's password. However, even in this scenario, the attacker has to have access to the exact software token which is installed on the user's machine, because only that token has the correct seed which enables it to generate valid OTPs. In cases when this risk is acceptable, software tokens are a cheap and easy way of improving security.

Logs generated by the authentication server are satisfactory. Logging could be better, but it does provide enough information. It is being considered that, if this solution is implemented, logs will be processed regularly to spot any anomalies. The product can also be configured to store logs in its own log files, or to use the System Log on Windows operating systems.

The authentication agent is easy to install and stable. It changes the normal login screen so that the user is asked to provide whatever information is configured on the server (usually a PIN, or a user ID and password used together with the OTP). Integration would be very easy and straightforward.

The server supports the use of multiple systems to additionally authenticate users; local authentication or domain controllers can be used.

Administrator privileges allow the server to be configured easily so that the password is required as well, which raises the security level.

The server can be installed in a completely redundant mode, which prevents failure of the authentication system in the event of a disaster.

Client level
Registration:

The user requests authentication by using a secret PIN or user ID and password. A one-time registration process is completed through the available user interface: the user registers his identification by giving the official details provided by the organization, and also registers his personal GPRS device, such as a mobile phone or PDA.

User Access:
The user requests authentication by using the secret PIN or user ID and password. The server validates the user; if valid, it generates an OTP, which is sent to the user's registered GPRS device via the SMS service, so that the user can complete authentication using that OTP. In this way the user gets secure authentication.

Proposed Quality Standards:
The OTP is generated by the SHA-1 algorithm.
The OTP should meet the FIPS 180-1 standard.
The lifetime of the OTP is 5 minutes by default.
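The 5-minute lifetime could be enforced by recording a timestamp when the OTP is issued, along the lines of this sketch (the class name and in-memory storage are illustrative only):

import java.time.Duration;
import java.time.Instant;

// Illustrative lifetime check: an issued OTP is accepted only within the
// default 5-minute window stated above.
public class OtpRecord {
    private static final Duration LIFETIME = Duration.ofMinutes(5);

    private final String otp;
    private final Instant issuedAt;

    public OtpRecord(String otp) {
        this.otp = otp;
        this.issuedAt = Instant.now();
    }

    public boolean isValid(String submitted) {
        boolean fresh = Duration.between(issuedAt, Instant.now())
                .compareTo(LIFETIME) <= 0;
        return fresh && otp.equals(submitted);
    }
}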

INTRODUCTION TO UML DIAGRAM (Unified Modeling Language)

The Unified Modeling Language (UML) is a standard language for writing software blueprints. The UML may be used to visualize, specify, construct, and document the artifacts of a software-intensive system.

UML DIAGRAMS

Use Case Diagrams:
Use cases are used during requirements elicitation and analysis to represent the functionality of the system, focusing on the behavior of the system from an external point of view, and yield a visible result for an actor.
An actor is any entity that interacts with the system.
Use case Notations:
A use case is described by a template composed of six fields.
Name: it is unique across the system, having no ambiguity for developers.
Participating actors: the actors interacting with the use case.
Entry conditions: describe the conditions that need to be satisfied before the use case is initiated.
Flow of events: describes the sequence of actions of the use case.
Exit conditions: describe the conditions that need to be satisfied after the completion of the use case.
Special requirements: requirements that are not related to the functionality of the system, including constraints on the performance of the system, its implementation, and so on.

TESTING
Testing is conducted to uncover errors and to ensure that defined input will produce actual results that agree with required results. Once code has been generated, program testing begins. The testing process focuses on the logical internals of the software, ensuring all statements have been tested.
Testing Objectives:
Testing is a process of executing a program with the intent of finding an error.
A good test case is one that has a high probability of finding an as yet undiscovered error.
A successful test is one that uncovers an as yet undiscovered error.

Any work can be completed, but completion should give satisfaction. In order to give ourselves and the end user that satisfaction, we performed a series of testing processes, which rectified mistakes made during coding. This made our project highly reliable and efficient.
This document provides structure to the testing. It describes which artifacts will be tested. Without this document, the testing would be haphazard, in which case the team could have little confidence in the product it will deliver.
In our project the main test cases are used to validate the user who wants to access the services present in the portal. In order to access the services present in the portal the user needs to register and then log in through the site. If the given details are valid then the user can access the services.

TEST CASES:
There are two main test cases in our project:
Validating the static user ID and password.
Validating the OTP.

Test Case 1:
Validating Static User Id & Password 
If the given user ID and password do not match the user ID and password stored in the database, then the user cannot log in. The user should therefore provide the correct user ID and password in order to log in.

Test Case 2:
Validating the OTP
If the OTP obtained after logging into the system with the static user ID and password is invalid, the user cannot access the services present in the portal. The user can access the services present in the portal only by entering a valid OTP.
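These two test cases could be expressed as unit tests along the following lines; the AuthService class here is a minimal hypothetical stub standing in for the portal's real authentication component:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import java.util.HashMap;
import java.util.Map;
import org.junit.Test;

// Minimal stub standing in for the portal's real authentication component.
class AuthService {
    private final Map<String, String> users = new HashMap<>();
    private final Map<String, String> otps = new HashMap<>();

    AuthService() { users.put("alice", "correct-password"); }

    boolean login(String user, String password) {
        return password.equals(users.get(user));
    }

    String issueOtp(String user) {
        String otp = "123456"; // fixed in the stub; random in practice
        otps.put(user, otp);
        return otp;
    }

    boolean verifyOtp(String user, String otp) {
        return otp.equals(otps.remove(user)); // an OTP is consumed on use
    }
}

public class AuthenticationTest {
    private final AuthService auth = new AuthService();

    @Test
    public void rejectsWrongStaticIdAndPassword() { // Test Case 1
        assertFalse(auth.login("alice", "wrong-password"));
    }

    @Test
    public void requiresValidOtpAfterLogin() { // Test Case 2
        assertTrue(auth.login("alice", "correct-password"));
        String otp = auth.issueOtp("alice");
        assertTrue(auth.verifyOtp("alice", otp));
        assertFalse(auth.verifyOtp("alice", "000000")); // invalid OTP rejected
    }
}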

CONCLUSION
This application proposes a secure, convenient and user-friendly two-factor authentication scheme. People may easily forget passwords, but are less prone to misplacing their personal smart phones. The scheme is also more convenient for the bank: the user simply requires a personal electronic device, such as a mobile phone, to enforce authentication on top of weak credentials (userid/password). It is very efficient for secure financial transactions.

The benefits of the two factor authentication scheme can be summarized as follows: 
No specialized hardware (OTP generators) needed by the user or the financial service provider (bank);
More convenience for the bank, which can rely on a device owned and maintained by the customer;
Strong security deriving from the usage of well-known cryptographic primitives as building blocks;
No need to set up costly or unreliable connectionless services (as in SMS-based authentication mechanisms) or connection-oriented links;
The possibility of using a single device to authenticate to multiple service providers.
