Wednesday, 25 December 2013

Ergonomics in HCI.

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance (International Ergonomics Association).



Ergonomics (or human factors) is traditionally the study of the physical characteristics of the interaction: how the controls are designed, the physical environment in which the interaction takes place, and the layout and physical qualities of the screen.

Ergonomics is employed to fulfil the two goals of health and productivity. It is relevant in the design of such things as safe furniture and easy-to-use interfaces to machines and equipment. Proper ergonomic design is necessary to prevent repetitive strain injuries, which can develop over time and can lead to long-term disability. Ergonomics is concerned with the ‘fit’ between people, their technological tools, and their environments.

Practitioners of ergonomics contribute to the planning, design and evaluation of tasks, jobs, products, organizations, environments and systems in order to make them compatible with the needs, abilities and limitations of people.

User Experience in HCI

User experience (abbreviated as UX) is how a person feels when interfacing with a system and what they expect of the interface and the system. The system could be a website, a web application or desktop software and, in modern contexts, is generally denoted by some form of human-computer interaction (HCI).

User experience (UX) is about how a person feels about using a product, system or service. User experience highlights the experiential, affective, meaningful and valuable aspects of human-computer interaction and product ownership, but it also includes a person’s perceptions of the practical aspects such as utility, ease of use and efficiency of the system. User experience is subjective in nature, because it is about an individual’s feelings and thoughts about the system. User experience is dynamic, because it changes over time as the circumstances change.

•          All the aspects of how people use an interactive product: the way it feels in their hands, how well they understand how it works, how they feel about it while they’re using it, how well it serves their purposes, and how well it fits into the entire context in which they are using it.
•          User Experience (abbreviated: UX) is the quality of experience a person has when interacting with a specific design.
•          The user experience is the totality of end-users’ perceptions as they interact with a product or service. These perceptions include effectiveness (how good is the result?), efficiency (how fast or cheap is it?), emotional satisfaction (how good does it feel?), and the quality of the relationship with the entity that created the product or service (what expectations does it create for subsequent interactions?).


User Experience: Focuses on creating systems that are satisfying, enjoyable, entertaining, helpful, motivating, aesthetically pleasing, supportive of creativity, rewarding, fun and emotionally fulfilling.


Low-fidelity Prototypes and High-fidelity Prototypes


Fidelity refers to the level of detail, accuracy or coverage of a prototype. It can relate to functionality, but most people use the term in relation to visual appearance.


A prototype is a working model built to develop and test design ideas. In web and software interface design, prototypes can be used to examine content, aesthetics, and interaction techniques from the perspectives of designers, clients, and users.
The main purpose of prototyping is to involve the users in testing design ideas and get their feedback in the early stage of development, thus to reduce the time and cost. It provides an efficient and effective way to refine and optimize interfaces through discussion, exploration, testing and iterative revision.

Low-fidelity prototypes

Low-fidelity prototyping is mainly paper-based mock-up.
The lowest-fidelity prototypes are very quick hand sketches, while the highest are fully detailed, pixel-perfect renditions. Low-fidelity prototypes are constructed quickly to explore design alternatives and screen layouts, rather than to model the user’s interaction with a system. Typical examples of low-fidelity prototypes include storyboards, drawings, paper mockups, etc.

Examples
•          Paper prototyping
•          Balsamiq Mockups & iMockups

Advantages of Low Fidelity:
1. Less time and lower cost.
2. Evaluate multiple design concepts.
3. Useful communication device.
4. Address screen layout issues.


Disadvantages of Low Fidelity:
1. Limited usefulness for usability testing.
2. Navigational and flow limitations.
3. Facilitator-driven (someone must ‘play computer’ during testing).
4. Provides only a poor detailed specification to code from.



High-fidelity prototypes

High-fidelity prototyping is mainly computer-based simulation.
High-fidelity prototypes offer more realistic interactions and are better at conveying the range of design possibilities. High-fidelity prototyping, however, may make designers reluctant to change designs and less likely to fully explore the design space.
High-fidelity prototypes are typically fully interactive, represent the product’s core functionality, and are often built with prototyping systems. They are used mostly to explore and test the look and feel of the final product, and they help keep the team focused on the user experience. Because much of the final product’s functionality is simulated, users can operate the prototype, or even perform some real tasks with it.
Examples
•          Axure Pro 6
•          Microsoft Expression Blend (UI prototyping tool)
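
To give a feel for what “fully interactive” means, here is a minimal sketch using Python’s standard tkinter toolkit: a hypothetical two-screen checkout flow that a user can click through. The screens and labels are invented for illustration; real high-fidelity work would normally be done in a dedicated tool such as those listed above.

```python
# Minimal interactive-prototype sketch: two clickable "screens" the user can
# navigate between, simulating core functionality without a real back end.
import tkinter as tk

root = tk.Tk()
root.title("Checkout prototype")  # hypothetical product flow

def show(frame):
    frame.tkraise()  # bring the chosen screen to the front

container = tk.Frame(root)
container.pack(fill="both", expand=True)
cart = tk.Frame(container)
confirm = tk.Frame(container)
for screen in (cart, confirm):
    screen.grid(row=0, column=0, sticky="nsew")

tk.Label(cart, text="Your cart: 2 items").pack(padx=30, pady=10)
tk.Button(cart, text="Checkout", command=lambda: show(confirm)).pack(pady=10)

tk.Label(confirm, text="Order confirmed!").pack(padx=30, pady=10)
tk.Button(confirm, text="Back to cart", command=lambda: show(cart)).pack(pady=10)

show(cart)
root.mainloop()
```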

Advantages of High Fidelity:
1. Partially or fully interactive functionality.
2. User-driven.
3. Clearly defines navigational schemas.
4. Useful for exploration and testing.
5. Marketing and sales tool.

Disadvantages of High Fidelity:
1. Time-consuming to create.
2. Inefficient for proof-of-concept designs.
3. Management may think it is real.

Describe the role of HCI with Advantages and Disadvantages.

"Human-computer interaction (HCI) is the study of interaction between people (users) and computers. Interaction between users and computers occurs at the user interface (or simply interface), which includes both hardware (i.e. peripherals and other hardware) and software (for example determining which, and how, information is presented to the user on a screen)."

“Human-Computer Interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use.”
Human-Computer Interaction (HCI) research is performed to provide and promote a scientific understanding of the interaction between humans and the computer technology and tools that we use. 
A basic goal of HCI is to improve the interactions between users and computers by making computers more usable and receptive to the user's needs.

There are 4 types of User Interfaces:

i. Command Line Interface (CLI)
ii. Menu Driven Interface
iii. Graphical User Interface (GUI)
iv. Natural Language Interface


i. Command Line Interface (CLI)

A CLI displays a prompt; the user types a command on the keyboard, and the computer executes the command, providing textual output.
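
A minimal sketch of the idea in Python (the `list`/`exit` commands and the `-l` switch are invented for illustration):

```python
# Minimal command-line interface sketch: display a prompt, read a command,
# execute it, and print textual output.

def run_cli():
    files = ["notes.txt", "report.doc"]  # toy data standing in for a file system
    while True:
        command = input("> ").strip().split()
        if not command:
            continue
        if command[0] == "list":
            # A "switch" (option) makes the command more flexible, e.g. list -l
            verbose = "-l" in command[1:]
            for name in files:
                print(f"{name} (file)" if verbose else name)
        elif command[0] == "exit":
            break
        else:
            # The "hidden features" problem: the user must already know the commands
            print(f"Unknown command: {command[0]}")

if __name__ == "__main__":
    run_cli()
```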
Advantages
•          Very flexible with the use of “switches” (options)
•          Good for “expert” users - can quickly access commands
•          Uses the fewest system resources
Disadvantages
•          Requires the user to learn “complex” commands or language
•          “Hidden” features i.e. if you don’t know the commands you won’t know the features are there!
•          Not very good for novice users
Command Line Interface Applications
•          System administration
•          Engineering applications
•          Scientific applications
•          Ideal for visually impaired users!!!

ii. Menu Driven Interface

The user has a list of items to choose from, and can make selections by highlighting one.
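
A minimal sketch of a menu-driven loop in Python (the ATM-style options are invented for illustration):

```python
# Minimal menu-driven interface sketch: the user picks from a numbered list,
# so no commands need to be memorised.

OPTIONS = ["Check balance", "Withdraw cash", "Deposit cash", "Quit"]

def run_menu():
    while True:
        for number, label in enumerate(OPTIONS, start=1):
            print(f"{number}. {label}")
        choice = input("Select an option: ")
        if not choice.isdigit() or not 1 <= int(choice) <= len(OPTIONS):
            print("Please enter a number from the menu.\n")
            continue
        label = OPTIONS[int(choice) - 1]
        if label == "Quit":
            break
        print(f"You selected: {label}\n")  # a real ATM would act on the choice

if __name__ == "__main__":
    run_menu()
```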
Advantages
•          No need to learn complex commands/language
•          Easier for a novice to learn/use
•          Ideal when there are a limited number of  options (efficient)
Disadvantages
•          Can be frustrating for experienced users i.e. the command they want to use is buried 5 levels deep!!!!

•          User interface may be limited by screen space and number of options available
Menu Driven Applications
•          ATM
•          Mobile Phone
•          MP3 Player
•          Video recorder
•          Household Devices
•          Digital/Cable TV

iii. Graphical User Interface (GUI)

Uses windows, icons, menus and pointers (WIMP), which can be manipulated by a mouse (and often, to an extent, by a keyboard as well).
It is the most suitable interface for inexperienced or novice users, but GUIs use more system resources than other types of interface.
Many generic packages for a GUI will share common features:
•          Layout of the screen
•          Names given to commands
•          Icons                    
•          Order of menus
•          Mouse operation
•          Dialog boxes
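
A minimal WIMP sketch using Python’s standard tkinter toolkit, showing a window, a menu, a pointer-driven button and a dialog box (the labels are invented for illustration):

```python
# Minimal GUI (WIMP) sketch: a Window, a Menu, a dialog box, and widgets
# manipulated with the Pointer.
import tkinter as tk
from tkinter import messagebox

root = tk.Tk()
root.title("WIMP demo")          # the Window

menubar = tk.Menu(root)          # the Menu
file_menu = tk.Menu(menubar, tearoff=0)
file_menu.add_command(label="Exit", command=root.destroy)
menubar.add_cascade(label="File", menu=file_menu)
root.config(menu=menubar)

def on_click():
    messagebox.showinfo("Dialog box", "Button clicked")  # a standard dialog

tk.Button(root, text="Click me", command=on_click).pack(padx=40, pady=40)
root.mainloop()                  # the Pointer drives the interaction
```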

Benefits of a common interface
There are five advantages to the ‘common user interface’:
•          Increased speed of learning
•          Ease of use
•          Confidence for novice users
•          Increased range of tasks users can solve
•          Greater range of software available to the average computer user

iv. Natural Language Interface

Can range from simple command systems to voice activated text processing. Commands are spoken in “normal” language.
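
A toy sketch of the “simple command system” end of the range, in Python. It only matches keywords in typed sentences; real natural language interfaces add speech recognition and proper language processing. The phrases it understands are invented for illustration:

```python
# Toy natural-language command sketch: match keywords in a typed sentence.

def interpret(sentence: str) -> str:
    words = set(sentence.lower().split())
    if {"open", "file"} <= words:
        return "Opening a file..."
    if {"send", "email"} <= words or {"send", "mail"} <= words:
        return "Composing an email..."
    # The hard part: countless phrasings (and dialects) can mean the same thing
    return "Sorry, I did not understand that."

print(interpret("Please open the file for me"))   # Opening a file...
print(interpret("Could you send an email now?"))  # Composing an email...
```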
Advantages
•          No training required – you just tell the computer what you want to do!
•          Can be quicker than keyboard entry
•          Hands-free – could be invaluable in some environments
•          Can be used by the disabled
Disadvantages
•          Emerging technology – still contains “bugs”
•          Difficulty dealing with homonyms
•          Difficult to recognise all the different ways of saying things (and regional dialects)
•          Artificial languages are often more precise

Sunday, 24 November 2013

What is the Difference Between OLAP and OLTP?


OLTP

 
Online transaction processing takes care of day-to-day, transaction-related data.
The data keeps changing every day. The focus for OLTP is to be able to access every single record. Replicating the data is very difficult, as OLTP has a very complex data model compared to OLAP.
The main emphasis for an OLTP system is very fast query processing, maintaining data integrity in multi-access environments, and effectiveness measured by the number of transactions per second.

Example: In a banking system, you withdraw an amount from your account. The account number, withdrawal amount, available amount, balance amount, transaction number, etc. are operational data elements.
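
A minimal sketch of such a transaction, using Python’s built-in sqlite3 module as a stand-in for a banking database (the table and account names are invented):

```python
# OLTP-style withdrawal sketch: one short transaction that must preserve
# data integrity (commit on success, roll back on failure).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (account_no TEXT PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('ACC-1001', 500.0)")
conn.commit()

def withdraw(account_no, amount):
    try:
        with conn:  # BEGIN ... COMMIT, or automatic ROLLBACK on error
            (balance,) = conn.execute(
                "SELECT balance FROM accounts WHERE account_no = ?",
                (account_no,)).fetchone()
            if balance < amount:
                raise ValueError("insufficient funds")
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE account_no = ?",
                (amount, account_no))
    except ValueError as err:
        print("Transaction rolled back:", err)

withdraw("ACC-1001", 200.0)   # succeeds: balance becomes 300.0
withdraw("ACC-1001", 1000.0)  # fails: rolled back, balance unchanged
print(conn.execute("SELECT balance FROM accounts").fetchone())  # (300.0,)
```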
 

OLAP

 

Online analytical processing stores historical data, which is useful for analysis and reporting. It contains snapshot and event (fact) tables. Replication of data is easy here, as the data model is simplified.

Example: If we collect the last 10 years of data about flight reservations, the data can give us much meaningful information, such as trends in reservations. This may reveal useful information like the peak time of travel and what kinds of people are travelling in the various classes (Economy/Business), etc.
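
A minimal sketch of this kind of aggregate analysis, again using sqlite3 with a few invented rows:

```python
# OLAP-style aggregate query sketch over historical reservation data
# (toy rows; a real warehouse would hold years of fact records).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reservations (year INT, month INT, class TEXT, seats INT)")
conn.executemany("INSERT INTO reservations VALUES (?, ?, ?, ?)", [
    (2012, 7, "Economy", 120), (2012, 7, "Business", 18),
    (2012, 12, "Economy", 160), (2013, 7, "Economy", 140),
    (2013, 12, "Business", 25), (2013, 12, "Economy", 170),
])

# Typical OLAP question: which months are the peak travel times, by class?
for row in conn.execute("""
        SELECT month, class, SUM(seats) AS total_seats
        FROM reservations
        GROUP BY month, class
        ORDER BY total_seats DESC"""):
    print(row)
```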

 


I have listed below some of the basic differences between OLTP and OLAP:

Points               | OLTP                         | OLAP
---------------------|------------------------------|---------------------
Speed                | Fast                         | Depends on the data
Storage requirements | Less if old data is archived | Large
Data source          | Operational data             | From OLTP databases
Normalization        | Highly normalized            | Less normalized
Queries              | One tuple at a time          | Aggregate queries
Query type           | Insert, update, delete       | Select

1)      OLAP systems can be designed using the data from OLTP systems by means of ETL processes (a minimal sketch follows this list).

2)      Data marts or data warehouses can be maintained as OLAP systems.

3)      OLAP is used for business reporting; OLTP supports day-to-day operational processing.
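
A minimal ETL sketch in Python with sqlite3 (the source and warehouse tables are invented): Extract rows from the OLTP source, Transform them into per-day aggregates, and Load them into an OLAP-style fact table.

```python
# Minimal ETL sketch: Extract from an OLTP source, Transform (aggregate),
# and Load into an OLAP-style fact table.
import sqlite3

oltp = sqlite3.connect(":memory:")   # operational system
olap = sqlite3.connect(":memory:")   # warehouse / data mart

oltp.execute("CREATE TABLE sales (sale_date TEXT, product TEXT, amount REAL)")
oltp.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("2013-11-01", "widget", 10.0), ("2013-11-01", "widget", 12.5),
    ("2013-11-02", "gadget", 99.0),
])
olap.execute("CREATE TABLE daily_sales (sale_date TEXT, product TEXT, total REAL)")

# Extract + Transform: aggregate per day and product at the source
rows = oltp.execute("""
    SELECT sale_date, product, SUM(amount) FROM sales
    GROUP BY sale_date, product""").fetchall()

# Load into the warehouse fact table
olap.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", rows)
print(olap.execute("SELECT * FROM daily_sales").fetchall())
```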

 

Friday, 18 October 2013

Data Warehouse Architectures

Data warehouses and their architectures vary depending upon the specifics of an organization's situation.

Data Warehouse Architecture (Basic)

This is a simple architecture for a data warehouse: end users directly access data derived from several source systems through the data warehouse.

[Figure: Architecture of a Data Warehouse]

This illustrates three things:

  • Data Sources (operational systems and flat files)
  • Warehouse (metadata, summary data, and raw data)
  • Users (analysis, reporting, and mining)
The metadata and raw data of a traditional OLTP system is present, as is an additional type of data, summary data. Summaries are very valuable in data warehouses because they pre-compute long operations in advance. For example, a typical data warehouse query is to retrieve something like August sales. A summary in Oracle is called a materialized view.
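
Oracle’s materialized views are beyond a short example, but the idea of pre-computing a summary can be imitated in any SQL engine. A minimal sketch using Python’s sqlite3, with invented table names:

```python
# Summary-table sketch: pre-compute "sales per month" once, so that a typical
# warehouse query ("August sales") reads the small summary, not the raw facts.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("JUL", 100.0), ("AUG", 80.0), ("AUG", 45.0), ("SEP", 60.0)])

# Pre-compute the long aggregation once. Oracle would use a materialized view,
# which it can also refresh as the base data changes.
conn.execute("""CREATE TABLE monthly_sales AS
                SELECT month, SUM(amount) AS total FROM sales GROUP BY month""")

# The "August sales" query now hits the small summary table.
print(conn.execute(
    "SELECT total FROM monthly_sales WHERE month = 'AUG'").fetchone())  # (125.0,)
```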



Data Warehouse Architecture (with a Staging Area)


You need to clean and process your operational data before putting it into the warehouse. You can do this programmatically, although most data warehouses use a staging area instead. A staging area simplifies building summaries and general warehouse management.
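
A minimal sketch of the kind of cleaning a staging area performs, in plain Python; the inconsistent source formats are invented for illustration:

```python
# Staging-area sketch: clean inconsistent operational records before they
# are loaded into the warehouse.

raw_records = [  # as extracted from two different source systems
    {"customer": "  Smith, John ", "country": "UK",             "amount": "120.50"},
    {"customer": "JOHN SMITH",     "country": "United Kingdom", "amount": "80"},
]

COUNTRY_CODES = {"United Kingdom": "UK"}  # resolve naming conflicts

def normalise_name(name):
    # "Smith, John" and "JOHN SMITH" both become "John Smith"
    if "," in name:
        last, first = [part.strip() for part in name.split(",", 1)]
        name = f"{first} {last}"
    return " ".join(name.split()).title()

def clean(record):
    return {
        "customer": normalise_name(record["customer"]),
        "country": COUNTRY_CODES.get(record["country"], record["country"]),
        "amount": float(record["amount"]),  # one consistent numeric type
    }

staged = [clean(r) for r in raw_records]
print(staged)  # consistent names, country codes, and numeric amounts
```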

[Figure: Architecture of a Data Warehouse with a Staging Area]
This illustrates four things:
  • Data Sources (operational systems and flat files)
  • Staging Area (where data sources go before the warehouse)
  • Warehouse (metadata, summary data, and raw data)
  • Users (analysis, reporting, and mining)


Data Warehouse Architecture (with a Staging Area and Data Marts)


Although this architecture is quite common, you may want to customize your warehouse’s architecture for different groups within your organization. You can do this by adding data marts, which are systems designed for a particular line of business. The figure illustrates an example where purchasing, sales, and inventory are separated. In this example, a financial analyst might want to analyze historical data for purchases and sales.

[Figure: Architecture of a Data Warehouse with a Staging Area and Data Marts]


This illustrates five things:
  • Data Sources (operational systems and flat files)
  • Staging Area (where data sources go before the warehouse)
  • Warehouse (metadata, summary data, and raw data)
  • Data Marts (purchasing, sales, and inventory)
  • Users (analysis, reporting, and mining)


Data Warehousing Concepts

The original concept of a data warehouse was devised by IBM as the ‘information warehouse’ and presented as a solution for accessing data held in non-relational systems.

A data warehouse (abbreviated DW) is a relational database that is designed for query and analysis rather than for transaction processing; it is a collection of data designed to support management decision making.
 
A single, complete and consistent store of data obtained from a variety of different sources, made available to end users in a way they can understand and use in a business context.

It usually contains historical data derived from transaction data, but it can include data from other sources. It separates analysis workload from transaction workload and enables an organization to consolidate data from several sources.

Development of a data warehouse includes development of systems to extract data from operational systems, plus installation of a warehouse database system that provides managers flexible access to the data.

Data warehouse: A subject-oriented, integrated, time-variant, and non-volatile collection of data in support of management’s decision-making process.

Subject-oriented: The warehouse is organized around the major subjects of the enterprise (such as customers, products, and sales) rather than the major application areas (such as customer invoicing, stock control, and product sales).

Integrated: Integration is closely related to subject orientation. Data warehouses must put data from disparate sources into a consistent format. They must resolve such problems as naming conflicts and inconsistencies among units of measure. When they achieve this, they are said to be integrated.
The source data is often inconsistent (for example, having different formats) because it comes from different application systems.
 
Time-variant: The data in the warehouse is only accurate and valid at some point in time or over some time interval.

Non-volatile: The data is not updated in real time but is refreshed from operational systems on a regular basis. New data is always added as a supplement to the database, rather than as a replacement.




Monday, 7 October 2013

Spiral Model

The spiral model is similar to the incremental model, with more emphasis placed on risk analysis. The spiral model has four phases: Planning, Risk Analysis, Engineering and Evaluation.
The spiral model combines the idea of iterative development with the systematic, controlled aspects of the waterfall model.
The spiral model is a combination of the iterative development process model and the sequential linear development model (i.e. the waterfall model), with a very high emphasis on risk analysis.
A software project repeatedly passes through these phases in iterations (called spirals in this model). In the baseline spiral, starting in the planning phase, requirements are gathered and risk is assessed. Each subsequent spiral builds on the baseline spiral.
  • High amount of risk analysis; hence, risk avoidance is enhanced.
  • Good for large and mission-critical projects.
  • Strong approval and documentation control.
Software is produced in the engineering phase, along with testing at the end of the phase.  The evaluation phase allows the customer to evaluate the output of the project to date before the project continues to the next spiral.
[Diagram of the Spiral model]

Incremental Model

In the incremental model the whole requirement is divided into various builds. The incremental model is an evolution of the waterfall model: multiple development cycles take place here, making the life cycle a “multi-waterfall” cycle. Cycles are divided up into smaller, more easily managed iterations, and each iteration passes through the requirements, design, implementation and testing phases.

•          More flexible – less costly to change scope and requirements.

•          Easier to test and debug during a smaller iteration.
[Diagram of the Incremental model]

V-Model

The V-model stands for the Verification and Validation model. Just like the waterfall model, the V-shaped life cycle is a sequential path of execution of processes. Each phase must be completed before the next phase begins, and testing of the product is planned in parallel with a corresponding phase of development.

The V-model is an extension of the waterfall model and is based on the association of a testing phase with each corresponding development stage. This means that for every single phase in the development cycle there is a directly associated testing phase. This is a highly disciplined model, and the next phase starts only after completion of the previous phase.

[Diagram of the V-model]


Verification Phases

Following are the Verification phases in V-Model:
  • Business Requirement Analysis: This is the first phase in the development cycle, where the product requirements are understood from the customer's perspective. This phase involves detailed communication with the customer to understand his expectations and exact requirements.
  • System Design: Once you have clear and detailed product requirements, it's time to design the complete system. System design comprises understanding and detailing the complete hardware and communication setup for the product under development. The system test plan is developed based on the system design; doing this at an earlier stage leaves more time for actual test execution later.
  • Architectural Design: Architectural specifications are understood and designed in this phase. Usually more than one technical approach is proposed, and the final decision is taken based on technical and financial feasibility. The system design is broken down further into modules taking up different functionality. The data transfer and communication between the internal modules and with the outside world (other systems) is clearly understood and defined in this stage. With this information, integration tests can be designed and documented during this stage.
  • Module Design: In this phase the detailed internal design for all the system modules is specified, referred to as Low Level Design (LLD). It is important that the design is compatible with the other modules in the system architecture and with other external systems. Unit tests are an essential part of any development process and help eliminate faults and errors at a very early stage. Unit tests can be designed at this stage based on the internal module designs.

Validation Phases
Following are the Validation phases in V-Model:
  • Unit Testing: Unit tests designed in the module design phase are executed on the code during this validation phase. Unit testing is testing at the code level and helps eliminate bugs at an early stage, though not all defects can be uncovered by unit testing (a minimal example follows this list).
  • Integration Testing: Integration testing is associated with the architectural design phase. Integration tests are performed to test the coexistence and communication of the internal modules within the system.
  • System Testing: System testing is directly associated with the System design phase. System tests check the entire system functionality and the communication of the system under development with external systems. Most of the software and hardware compatibility issues can be uncovered during system test execution.
  • Acceptance Testing: Acceptance testing is associated with the business requirement analysis phase and involves testing the product in the user environment. Acceptance tests uncover compatibility issues with the other systems available in the user environment. They also discover non-functional issues such as load and performance defects in the actual user environment.
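
A minimal example of the unit-testing idea, using Python's built-in unittest module; the function under test is hypothetical:

```python
# Minimal unit-test sketch: tests written against a module's low-level design
# are run against the code to catch faults at an early stage.
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical unit under test from a module's low-level design."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)

if __name__ == "__main__":
    unittest.main()
```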

Waterfall Model

The Waterfall Model was the first process model to be introduced. It is also referred to as a linear-sequential life cycle model. It is very simple to understand and use. In a waterfall model, each phase must be completed fully before the next phase can begin. At the end of each phase, a review takes place to determine if the project is on the right path and whether or not to continue or discard the project. In the waterfall model, phases do not overlap.
The waterfall model is the earliest SDLC approach that was used for software development.

The sequential phases in Waterfall model are:

  • Requirement Gathering and Analysis: All possible requirements of the system to be developed are captured in this phase and documented in a requirement specification document.
  • System Design: The requirement specifications from the first phase are studied in this phase and the system design is prepared. System design helps in specifying hardware and system requirements and in defining the overall system architecture.
  • Implementation: With inputs from the system design, the system is first developed in small programs called units, which are integrated in the next phase. Each unit is developed and tested for its functionality, which is referred to as unit testing.
  • Integration and Testing: All the units developed in the implementation phase are integrated into a system after testing of each unit. After integration, the entire system is tested for any faults and failures.
  • Deployment of System: Once the functional and non-functional testing is done, the product is deployed in the customer environment or released into the market.
  • Maintenance: Some issues come up in the client environment. To fix those issues, patches are released. Better versions are also released to enhance the product. Maintenance is done to deliver these changes in the customer environment.
