Tuesday, 4 August 2015



Knowledge can be defined as the link, formed in a person's mind, between information and its application in a particular context. Knowledge occurs when value is added to data by categorizing, contextualizing and calculating it. Information becomes knowledge when it is processed in people's minds, and that knowledge becomes information again when it is communicated to others in the form of text, writing or spoken words.

Characteristics of Knowledge Transfer

There are many characteristics that affect the transfer of knowledge in an organization; they are classified into two types, namely situational characteristics and knowledge characteristics. Knowledge characteristics concern the knowledge itself and the sender and receiver within the organization. The risk the organization faces during knowledge transfer is termed knowledge stickiness. Stickiness arises from the methods and concepts involved in constructing and transferring knowledge, and studying it helps to resolve the issues that occur while knowledge is transferred in an organization (Dixon, 2000). Stickiness in the transfer process depends on the following characteristics: the character of the knowledge, the sender and the receiver, and the relationship between the sender and the receiver (Wang et al., 2001). While knowledge transfer is taking place, these aspects must be handled carefully so that stickiness does not undermine the success of the transfer.
The situational characteristics are the source, the recipient and the organizational context. The knowledge source initiates the process, and its willingness to transfer the knowledge is critical, since knowledge transfer is so important to the success of an organization (Awad and Ghaziri, 2004). Another aspect of the source that affects the transfer is its perceived reliability. The final situational aspect is the organizational context within which the transfer process is completed.


Thus, knowledge transfer deals with the concepts and facts present in people's minds. It is a process of communication in which information is transmitted between a sender and a recipient, whether individuals or groups, within the organization. The role of knowledge transfer is therefore to build the ability to act and to evaluate new information in an organization.


1. Dixon, H., Common Knowledge: How Companies Thrive by Sharing Knowledge, Harvard University Press, Boston, 2000.
2. Awad, E.M. and Ghaziri, M., Knowledge Management, Pearson Education, NJ, 2004.
3. Wang, K. et al., Introduction to Knowledge Management: Principles and Practice, Tapir Academic Press, Trondheim, 2001.



Data management is defined as the process by which data or information is validated, stored, processed and protected in order to satisfy the requirements of end users. Data is the heart of information and the primary raw material on which organizations operate. Data management can also be described as the execution of the policies and procedures needed to manage information or data effectively.

Emerging Technologies for Effective Data Management

There are many technologies for the effective management of data, of which some of the most important are discussed here. The first is the use of scripted programs for the analysis of data. In this method the data is thoroughly processed and analyzed to produce metadata, and the data can be revised at any point in time, since a record is kept from the moment of collection until publication (Gordon, 2007). The second technology is the storage of data using non-proprietary software and hardware. Data stored this way remains readable for decades even if the original software is lost, whereas data held in proprietary formats may become inaccessible once the software is no longer available (Mehdi, 1998). The third method for effective data management is the use of descriptive names for the files that contain the data (Mathew, 2001). This is the quickest way to indicate what a file holds, and it follows two simple conventions: file names should be kept short, and they should contain no spaces (underscores can be used instead).
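The file-naming convention can be made concrete with a short Python sketch. The helper below, its name and its length limit are illustrative choices, not drawn from the cited sources; it checks a data-file name against two commonly recommended rules: keep the name short and avoid spaces.

```python
import re

def validate_filename(name, max_length=32):
    """Check a data-file name against simple naming conventions:
    keep it short, and use underscores instead of spaces."""
    problems = []
    if len(name) > max_length:
        problems.append("name longer than %d characters" % max_length)
    if " " in name:
        problems.append("name contains spaces; use underscores")
    if not re.match(r"^[A-Za-z0-9._-]+$", name):
        problems.append("name contains special characters")
    return problems

# A file following the convention passes with no problems reported.
assert validate_filename("rainfall_2015_site3.csv") == []
# Spaces and excessive length are both flagged.
assert len(validate_filename("my rainfall data collected at site three in 2015.csv")) >= 2
```

A script like this can be run over a whole data directory before publication, so naming problems are caught while the data is still being collected.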


The technologies described above show that data can be managed effectively, kept free from errors and structured still more effectively for analysis. More importantly, well-managed data remains interpretable for future research and ongoing management. These technologies therefore make it possible to revise and reuse data, even for research conducted decades later.


1. Gordon, K., Principles of Effective Data Management, British Informatics Society Limited, UK, 2007.
2. Mathew, B., Effective Management of Data, California University, CA, 2001.
3. Mehdi, K., Effective Data Management and Emerging Technologies, Idea Publishing, New York, 1998.


A ratio indicates the association between two interconnected variables; that is, ratios establish the relationship between two items stated in quantitative form. Ratio analysis is the analysis of a firm's financial statements (balance sheet, profit and loss statement, cash flow statement, etc.) for a specific period, interpreting the financial results with the help of the available ratios. In finance, a ratio is an association between two numbers of the same kind, usually expressed as a:b or 'a to b'.
Ratio analysis
In ratio analysis, financial ratios act as the tool. The data required for the analysis is gathered from the firm's financial statements. Generally, ratio analysis is used to determine the financial safety and financial condition of a firm. It is an essential technique for establishing the relationship between two accounting numbers so as to highlight important information to management and to users who need to assess the condition of the business and view its performance in a meaningful way. Various studies (for example, Minter et al., Eds., 1982; Kumbirai & Webb, 2010) have examined the use of ratio analysis in evaluating the financial performance of firms. In addition, ratio analysis assists in budgeting, long-term planning and strengthening financial performance through asset management.
The ratios used in ratio analysis are classified into several types: liquidity ratios (including the quick, liquid or acid-test ratio), solvency ratios (also called financial structure or capital structure ratios), profitability ratios, activity or efficiency ratios, and coverage ratios.
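Two of the liquidity measures mentioned above can be illustrated with a small Python sketch; the balance-sheet figures below are invented for the example.

```python
def liquidity_ratios(current_assets, inventory, current_liabilities):
    """Two common liquidity measures taken from balance-sheet figures."""
    current_ratio = current_assets / current_liabilities
    # The quick (acid-test) ratio excludes inventory, the least liquid asset.
    quick_ratio = (current_assets - inventory) / current_liabilities
    return current_ratio, quick_ratio

current, quick = liquidity_ratios(current_assets=500_000,
                                  inventory=200_000,
                                  current_liabilities=250_000)
assert current == 2.0  # a 2:1 current ratio: short-term debts covered twice over
assert quick == 1.2
```

The same pattern extends to the other ratio families: each is a quotient of two figures read from the financial statements of the same period.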
Thus, it is clear that ratio analysis helps in analyzing every financial aspect of a firm, from its assets to its liabilities and debt, and in understanding the financial condition and overall performance of the entity.
Minter et al. (Eds.) (1982), Using Ratio Analysis to Evaluate Financial Performance, New Directions for Higher Education, Issue 38, pp. 25-38.
Kumbirai & Webb (2010), A Financial Ratio Analysis of Commercial Bank Performance in South Africa, African Review of Economics and Finance, Print Services, Rhodes University, Vol. 2, No. 1.

Tuesday, 28 July 2015


Introduction to Cloud Computing:
Cloud computing can be described as a new style of computing in which dynamically scalable and typically virtualized resources are offered as a service over the internet. Cloud computing has become an essential technology trend, and many professionals expect that it will reshape information technology processes and the information technology marketplace.
Benefits of Cloud Computing:
According to Armbrust et al. (2010), the most important benefit of cloud computing is that firms can reduce their capital expenditure and instead use operational expenditure to develop their computing capability. This lowers the barrier to entry and requires fewer in-house information technology resources for system support. Another benefit is that firms can start with a small deployment, grow to a large deployment rapidly, and scale back if necessary. Buyya et al. (2009) note that this flexibility allows firms to use additional resources at peak times, enabling them to meet customer demand. Maintenance is a further benefit: cloud service providers perform system maintenance, and access is through application programming interfaces that do not require applications to be installed on personal computers. With cloud computing, users employ a range of devices, including laptops, personal computers, personal digital assistants and smartphones, to access programs, application development platforms and storage over the internet through services offered by cloud providers. Mather, Kumaraswamy and Latif (2009) describe the essential features of cloud computing as broad network access, on-demand self-service, rapid elasticity, measured service and resource pooling. Another benefit is reliability, where services built on multiple redundant sites support disaster recovery and business continuity. A final benefit is mobile accessibility, where mobile employees gain productivity because the system is hosted on infrastructure reachable from anywhere.
Cloud computing improves profitability by increasing resource utilization: costs are reduced by provisioning resources only for the time they are required. Cloud computing is changing the way information technology departments purchase technology. Businesses have a range of paths to the cloud, including platforms, infrastructure and applications available from cloud providers as online services.
Armbrust, M., Fox, A., Griffith, R., Joseph, A.D., Katz, R., Konwinski, A., Lee, G., Patterson, D., Rabkin, A., Stoica, I. and Zaharia, M. (2010) A View of Cloud Computing, Communications of the ACM, 53, 4, 50-58.
Buyya, R., Yeo, C.S., Venugopal, S., Broberg, J. and Brandic, I. (2009) Cloud Computing and Emerging IT Platforms: Vision, Hype, and Reality for Delivering Computing as the 5th Utility, Future Generation Computer Systems, 25, 6, 599-616.

Mather, T., Kumaraswamy, S. and Latif, S. (2009), Cloud Security and Privacy. Sebastopol, CA: O'Reilly Media, Inc.


Code Division Multiple Access (CDMA) is a multiple access and modulation scheme based on spread-spectrum communication. CDMA is a key concept in advanced wireless communications and digital cellular radio communications.
With CDMA, each signal is combined with a pseudo-random binary sequence that modulates the carrier and spreads the spectrum of the waveform; CDMA is thus a direct-sequence spread-spectrum system. Numerous CDMA signals share the same frequency spectrum, so viewed in either the time or the frequency domain the multiple-access signals appear on top of each other. At the receiver they are separated by a correlator, which accepts only the signal energy from the selected binary sequence and despreads its spectrum. A CDMA system works directly on digital signals, such as 64 kbit/s streams; these may be ISDN channels, digitized voice, modem data and more (Adachi et al., 1998).
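The spreading and correlator-based despreading just described can be sketched in a few lines of Python. This is a deliberately simplified toy model (the ±1 chip codes and bit patterns are invented for illustration, not taken from the cited sources): two users with orthogonal codes transmit on top of each other, and correlating with one user's code recovers that user's bits while the other user's energy cancels out.

```python
def spread(bits, code):
    """Spread each data bit (+1/-1) over a pseudo-random chip sequence."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the received signal with one user's code: energy from
    the matching sequence adds up, the orthogonal user's cancels out."""
    n = len(code)
    out = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        out.append(1 if corr > 0 else -1)
    return out

# Two users with orthogonal 8-chip codes share the same spectrum.
code_a = [+1, -1, +1, -1, +1, -1, +1, -1]
code_b = [+1, +1, -1, -1, +1, +1, -1, -1]
channel = [a + b for a, b in zip(spread([+1, -1, +1], code_a),
                                 spread([-1, -1, +1], code_b))]
assert despread(channel, code_a) == [+1, -1, +1]
assert despread(channel, code_b) == [-1, -1, +1]
```

Real CDMA systems use much longer codes, power control and RF modulation, but the principle, separation by correlation rather than by frequency, is the same.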
According to Viterbi (1995), CDMA changes the nature of the subscriber station from an analog to a digital device. CDMA receivers do not abandon analog processing entirely, but they separate communication channels by means of a pseudo-random modulation that is applied and removed in the digital domain, not on a frequency basis, so multiple users occupy the same frequency band. CDMA is changing the face of cellular and PCS communication by improving voice quality and removing the audible effects of multipath fading, increasing telephone traffic capacity, reducing the incidence of dropped calls caused by handoff failures, providing a reliable transport mechanism for data communications such as internet traffic and facsimile, simplifying site selection, reducing the number of sites required to support a given amount of traffic, lowering deployment and operating costs since fewer cell sites are required, reducing average transmitted power, reducing potential health risks and minimizing interference with other electronic devices (Sari et al., 2000).
It can be concluded that CDMA is a digital modulation and radio access system that uses shared frequency bands to give multiple users continuous and simultaneous access to a wireless network.
1. Adachi, F., Sawahashi, M. and Suda, H. (1998), "Wideband DS-CDMA for Next-Generation Mobile Communication Systems", IEEE Communications Magazine, vol. 36, pp. 56-59.
2. Sari, H., Vanhaverbeke, F. and Moeneclaey, M. (2000), "Multiple Access Using Two Sets of Orthogonal Signal Waveforms", IEEE Communications Letters, vol. 4, no. 1, pp. 4-6.
3. Viterbi, A. (1995), CDMA: Principles of Spread Spectrum Communication, Addison-Wesley Wireless Communications Series.


Customer Relationship Management (CRM) is a managerial philosophy that helps build long-term relationships with potential customers. Maintaining customer relationships is essential and valuable to business organizations.
Stone et al. (2002) point out that most sectors of the financial services industry have tried to implement customer relationship management techniques in order to reach their goals. Banks have tried to adopt CRM techniques such as creating a customer-centric culture and organization, integrating communications and customer interactions across channels, maximizing customer profitability, securing customer relationships, identifying sales prospects and opportunities, supporting pricing, channel management and migration, managing customer value by developing propositions aimed at different customer groups, and supporting cross-selling and up-selling initiatives. Successful CRM concentrates on understanding the requirements and desires of customers, which is achieved by placing those requirements at the heart of the business and integrating them with the organization's strategy, people, technology and business processes.
Customer relationship management is a sound business strategy for identifying a bank's most profitable customers and prospects; banks must then invest time and attention in expanding long-term relationships with those customers through individualized repricing, marketing, customized service and discretionary decision making across the different sales channels the bank uses. Any financial institution adopting a customer relationship model should consider six key business requirements: create a customer-focused organization and infrastructure, assess customers' lifetime value, gain an accurate picture of each customer, maximize the profitability of each customer relationship, understand how to attract and retain the best customers, and maximize the rate of return on marketing campaigns (Chary and Ramesh, 2012).
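One of those six requirements, assessing lifetime customer value, can be illustrated with a simple discounted-margin model. The formula and the figures below are a common textbook-style sketch, not taken from Chary and Ramesh (2012): each year's margin is weighted by the chance the customer is still retained and discounted back to today.

```python
def customer_lifetime_value(annual_margin, retention_rate, discount_rate, years=10):
    """Sum of the yearly margin, weighted by the probability the customer
    is still retained in year t and discounted back to the present."""
    return sum(
        annual_margin * retention_rate ** t / (1 + discount_rate) ** t
        for t in range(years)
    )

# A customer worth 200 a year, 80% yearly retention, 10% discount rate.
clv = customer_lifetime_value(annual_margin=200.0,
                              retention_rate=0.8,
                              discount_rate=0.1)
assert clv > 200.0  # worth considerably more than a single year's margin
```

A bank can rank customers by this value to decide where individualized service and marketing spend will pay off.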
It can be concluded that CRM has emerged as a prominent business strategy in today's competitive environment. It involves new and advanced marketing strategies that not only manage existing customers but also acquire new ones.
1. Chary, T. Satya Narayana & Ramesh, R. (2012). Customer Relationship Management in Banking Sector - A Comparative Study, KKIMRC IJRHRM, 1(2), 20-29.
2. Stone, Merlin et al. (2002). The Evolution of CRM in Banking, Kogan Page.


Human Resource Information System (HRIS) has emerged as a vital tool for achieving organizational objectives. HRIS is an integrated system to gather, store, record, manage and deliver useful human resource information, shaping the interaction between information technology and human resource management.
HRIS is a process that utilizes information technology for effective human resource management functions and applications. It is a computerized system consisting of one or more interrelated databases that track employee information (Gill and Johnson, 2010). HRIS helps in recording and examining employee and organizational information and documents such as employee handbooks, safety procedures and emergency evacuation plans (Fletcher, 2005; Lee, 2008). It helps organizations maintain a complete, accurate and up-to-date database from which manuals and reports can be produced (Gara, 2001). According to Mathis and Jackson (2002), the nature of an HRIS varies with the size of the organization: in small organizations it tends to be informal, whereas in large organizations it is more formal and coordinated.
HRIS is a management tool used to understand patterns in human resource actions, policies and employee behavior, and to identify gaps in the human resource system. As a software package, HRIS provides a complete system for human resource management activities in a business (Aggarwal and Kapoor, 2012). The primary role of HRIS is to integrate information technology with human resource management, which has led to competitive advantage for organizations. HRIS speeds up transaction processing, reduces administration costs, improves the tracking and control of human resource actions and reduces information errors. HRIS enables efficiency and effectiveness and promotes competitiveness among organizations (Lengnick-Hall and Moritz, 2003).
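A minimal sketch of the database idea behind an HRIS, assuming an in-memory store with invented field names rather than any real HRIS product, might look like this in Python:

```python
from dataclasses import dataclass

@dataclass
class Employee:
    emp_id: int
    name: str
    department: str
    salary: float

class SimpleHRIS:
    """A tiny in-memory employee database: add records, query by department."""

    def __init__(self):
        self._records = {}

    def add(self, emp):
        # Keyed by employee id, so re-adding an id updates the record.
        self._records[emp.emp_id] = emp

    def by_department(self, department):
        return [e for e in self._records.values() if e.department == department]

hris = SimpleHRIS()
hris.add(Employee(1, "Asha", "Finance", 52_000.0))
hris.add(Employee(2, "Ravi", "IT", 61_000.0))
assert [e.name for e in hris.by_department("IT")] == ["Ravi"]
```

A real HRIS adds persistence, access control and reporting on top of exactly this kind of structured record keeping.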
It can be concluded that HRIS is a computerized system that assists in processing information related to human resource management and has become a critical part of every organization.
1. Gill, J. and Johnson, P. (2010), Research Methods for Managers, 4th edition, Sage Publications Limited.
2. Fletcher, P. (2005), "From Personnel Administration to Business-Driven Human Capital Management: The Transformation of the Role of HR in the Digital Age", in Gueutal and Stone (Eds.), The Brave New World of eHR, San Francisco, CA: Jossey-Bass, pp. 1-12.
3. Lee, A. (2008), "Relationship Between the Use of Information Technology and Performances of Human Resource Management", PhD thesis, Alliant International University, San Diego, USA.
4. Gara, S.J. (2001), "How HRIS Can Impact HR: A Complete Paradigm Shift for the 21st Century", Society for Human Resource Management (SHRM), Available: http://www.shrm.org/whitepapers/documents/default.asp?page= 630 01.asp
5. Mathis, R.L. and Jackson, J.H. (2002), Human Resource Management, 10th edition, USA: Thomson Learning, pp. 179-207.
6. Aggarwal, N. and Kapoor, M. (2012), Human Resource Information Systems (HRIS) - Its Role and Importance in Business Competitiveness, GIAN JYOTI E-Journal, Vol. 1.
7. Lengnick-Hall, M.L. and Moritz, S. (2003). The Impact of e-HR on the HRM Function, Journal of Labor Research, 24(3), 365-379.


Brain computer interface (BCI) is a fast-growing technology in which researchers aim to build a direct communication channel between the human brain and a computer. A BCI is a hardware and software communication system that allows humans to interact with their surroundings without involving muscles and peripheral nerves, using control signals generated from electroencephalographic activity.
A brain computer interface, also known as a brain machine interface (BMI), creates a new non-muscular channel for conveying a person's intentions to external devices such as speech synthesizers, computers, neural prostheses and assistive appliances. This is particularly attractive for people with severe motor disabilities: such an interface can improve their quality of life while also reducing the cost of intensive care (Khalid et al., 2009). A brain computer interface is an artificial intelligence system that recognizes patterns in brain signals through five stages: signal acquisition, preprocessing or signal enhancement, feature extraction, classification, and the control interface. Signal acquisition captures the brain signals and also performs noise reduction and artifact processing.
The preprocessing stage prepares the brain signals in a form suitable for further processing. The feature extraction stage identifies discriminative information in the recorded brain signal: the signal is mapped onto a vector containing effective, discriminant features drawn from the observed data. Extracting this useful information is a complicated task, because brain signals are mixed with other signals arising from a finite set of brain activities that overlap in both space and time; moreover, the signal is usually non-stationary and distorted by artifacts such as electro-oculography (EOG) or electromyography (EMG). The feature vector is often kept low-dimensional to reduce the complexity of the subsequent stages, but without losing useful information. The classification stage assigns the brain signals to classes on the basis of the feature vectors; choosing good discriminative features is important for effective pattern recognition and for deciphering the person's intentions. The control interface stage transforms the classified signal into a meaningful command for a connected device such as a computer or wheelchair (Wolpaw et al., 2002).
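The five-stage pipeline can be sketched end to end in Python. This is a deliberately simplified toy, the synthetic epochs, the single power feature and the fixed threshold are all invented for illustration and bear no resemblance to a real EEG classifier, but it shows how each stage hands its output to the next:

```python
def preprocess(signal):
    """Stage 2: remove the DC offset so only varying activity remains."""
    mean = sum(signal) / len(signal)
    return [s - mean for s in signal]

def extract_features(signal):
    """Stage 3: a single discriminative feature, the average signal power."""
    return sum(s * s for s in signal) / len(signal)

def classify(power, threshold=0.5):
    """Stage 4: map the feature (here one value) to an intended action."""
    return "move" if power > threshold else "rest"

def control_interface(label):
    """Stage 5: translate the classified intention into a device command."""
    return {"move": "WHEELCHAIR_FORWARD", "rest": "WHEELCHAIR_STOP"}[label]

# Stage 1 (acquisition) is simulated with two synthetic signal epochs.
active = [1.2, -0.9, 1.1, -1.3, 1.0, -1.1]
idle = [0.1, -0.1, 0.05, -0.05, 0.1, -0.1]
assert control_interface(classify(extract_features(preprocess(active)))) == "WHEELCHAIR_FORWARD"
assert control_interface(classify(extract_features(preprocess(idle)))) == "WHEELCHAIR_STOP"
```

In a real BCI the feature vector would hold many band-power and spatial features and the classifier would be trained per user, but the stage boundaries are exactly as described above.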
It is concluded that a brain computer interface is a communication and control channel that does not depend on muscles and peripheral nerves. BCIs can improve people's quality of life and also reduce the cost of intensive care.
1.      Khalid, M.B.; Rao, N.I.; Rizwan-i-Haque, I.; Munir, S.; Tahir, F. Towards a Brain Computer Interface Using Wavelet Transform with Averaged and Time Segmented Adapted Wavelets. In Proceedings of the 2nd International Conference on Computer, Control and Communication (IC4’09) Karachi, Sindh, Pakistan, February 2009; pp. 1–4.
2.      Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791


Grid computing technology is a set of methods and techniques for the combined use of multiple servers. These servers are specialized and work together as a single, logically integrated system.
According to Abbas (2005), grid computing technology permits accessing, consolidating and managing information technology resources in a distributed computing environment. Grid computing is an advanced distributed technology that combines applications, databases and servers into a single system using specialized software. Based on the level of complexity within an enterprise, grids are categorized as infra-grid, intra-grid, extra-grid and inter-grid. The requirements of grid computing technology include fault tolerance, scalability, security, global name spaces, handling heterogeneity, persistence and non-persistence, extensibility, autonomy and manageability. Grid computing saves financial resources in both capital and operating costs.
The infra-grid architecture optimizes resource sharing within the divisions and departments of a single organization; it forms a tightly controlled architecture with well-defined business integration, policies and security. The intra-grid is a more complex implementation than the infra-grid, since it concentrates on integrating the resources of several divisions and departments of an enterprise; it requires more elaborate policies for sharing resources, security and data. The extra-grid refers to sharing resources with a foreign partner under certain established relationships; it extends beyond the enterprise's local administrative management of resources, so mutual agreements on maintaining access to resources are significant. The inter-grid enables the storage and sharing of resources and data over the web, allowing collaboration between different organizations and companies. The complexity of a grid comes from its particular requirements for security, service levels and integration (Marin, 2011).
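The core idea, many workers presenting themselves as one logical system, can be sketched with Python's standard thread pool. The splitting scheme, the node count and the toy workload are invented for illustration; a real grid would schedule heterogeneous jobs across physically separate machines:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Work carried out by one grid node: here, just a partial sum."""
    return sum(chunk)

def grid_sum(data, nodes=4):
    """Split one job across several workers and combine the partial
    results, mimicking how a grid presents many specialized servers
    as a single, logically integrated system."""
    size = max(1, len(data) // nodes)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        return sum(pool.map(process_chunk, chunks))

# 1 + 2 + ... + 100 computed by four cooperating workers.
assert grid_sum(list(range(1, 101))) == 5050
```

The caller sees one function and one answer; the division of labor among nodes is hidden, which is precisely the "single integrated system" property the grid architectures above provide at enterprise scale.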
It is concluded that grid computing technology helps share and distribute data, enabling collaborative work both inside and outside the enterprise, while saving both operating and capital costs.
1. Abbas, A. (2005), Grid Computing: A Practical Guide to Technology and Applications, Charles River Media.
2. Marin, G. (2011), Grid Computing Technology, Database Systems Journal, Vol. 2, pp. 13-23.