Saturday, July 18, 2009

MIS and LMIS

MIS

Management Information System (MIS) refers broadly to a computer-based system that provides managers with the tools for organizing, evaluating, and efficiently running their departments. In order to provide information about the past and present and predictions about the future, an MIS can include software that helps in decision making, data resources such as databases, the hardware resources of a system, decision support systems, people management and project management applications, and any computerized processes that enable the department to run efficiently.

The development and management of information technology tools assist executives and the general workforce in performing any tasks related to the processing of information. MIS and business systems are especially useful for collating business data and producing reports to be used as tools for decision making.
Applications of MIS
With computers being as ubiquitous as they are today, there is hardly any large business that does not rely extensively on its IT systems.

However, there are several specific fields in which MIS has become invaluable.

1. Strategy Support

While computers cannot create business strategies by themselves, they can assist management in understanding the effects of their strategies and help enable effective decision making. MIS systems can be used to transform data into information useful for decision making. Computers can provide financial statements and performance reports to assist in the planning, monitoring, and implementation of strategy. MIS systems provide a valuable function in that they can collate into coherent reports volumes of data that would otherwise be unmanageable and broadly useless to decision makers. By studying these reports, decision makers can identify patterns and trends that would have remained unseen if the raw data were consulted manually.

MIS systems can also use these raw data to run simulations – hypothetical scenarios that answer a range of ‘what if’ questions regarding alterations in strategy. For instance, MIS systems can provide predictions about the effect on sales that an alteration in price would have on a product. These Decision Support Systems (DSS) enable more informed decision making within an enterprise than would be possible without MIS systems.
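
To make this concrete, here is a minimal sketch of the kind of 'what if' price simulation a DSS might run. All figures, and the simple constant-elasticity demand model, are hypothetical and purely for illustration:

# What-if price simulation: a toy Decision Support System calculation.
# The base price, base volume, and elasticity (-1.5) are invented figures.

def project_revenue(price, base_price=100.0, base_units=10_000, elasticity=-1.5):
    """Approximate demand response: % change in units = elasticity * % change in price."""
    price_change = (price - base_price) / base_price
    units = base_units * (1 + elasticity * price_change)
    return max(units, 0) * price  # projected revenue at this price

for candidate_price in (90, 95, 100, 105, 110):
    revenue = project_revenue(candidate_price)
    print(f"price {candidate_price:>3}: projected revenue {revenue:,.0f}")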

2. Data Processing

Not only do MIS systems allow for the collation of vast amounts of business data, but they also provide a valuable time-saving benefit to the workforce. Where in the past business information had to be manually processed for filing and analysis, it can now be entered quickly and easily onto a computer by a data processor, allowing for faster decision making and quicker reflexes for the enterprise as a whole.
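
As a small illustration of what collation means in practice, here is a sketch (plain Python; the sample records are invented) that rolls raw transaction rows up into a per-department summary report:

# Collating raw business data into a summary report.
from collections import defaultdict

records = [
    ("Sales", 1200.00), ("Sales", 850.50),
    ("Logistics", 430.25), ("Logistics", 975.00),
    ("HR", 310.10),
]

totals = defaultdict(float)
for department, amount in records:
    totals[department] += amount  # accumulate each department's transactions

print("Department totals")
for department, total in sorted(totals.items()):
    print(f"  {department:<10} {total:10,.2f}")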

Management by Objectives

While MIS systems are extremely useful for generating statistical reports and data analysis, they can also be of use as a Management by Objectives (MBO) tool. MBO is a management process by which managers and subordinates agree upon a series of objectives for the subordinate to attempt to achieve within a set time frame. Objectives are set using the SMART criteria: that is, objectives should be Specific, Measurable, Agreed, Realistic and Time-Specific.

The aim of these objectives is to provide a set of key performance indicators by which an enterprise can judge the performance of an employee or project. The success of any MBO objective depends upon the continuous tracking of progress, and in tracking this performance it can be extremely useful to make use of an MIS system. Since all SMART objectives are by definition measurable, they can be tracked through the generation of management reports to be analyzed by decision-makers.
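
Because SMART objectives are measurable, tracking one can be as simple as comparing a measured value against a target. A minimal sketch follows; the objective, target, and actual figures are hypothetical:

# Tracking a measurable MBO objective against its target.
from datetime import date

objective = {
    "description": "Close 50 support tickets per month",
    "target": 50,
    "actual": 38,
    "deadline": date(2009, 7, 31),
}

progress = objective["actual"] / objective["target"]
status = "on track" if progress >= 0.75 else "needs attention"
print(f"{objective['description']}: {progress:.0%} complete ({status}), "
      f"due {objective['deadline']:%B %d, %Y}")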

Benefits of MIS

The field of MIS can deliver a great many benefits to enterprises in every industry. Expert organisations such as the Institute of MIS along with peer reviewed journals such as MIS Quarterly continue to find and report new ways to use MIS to achieve business objectives.

Core Competencies

Every market-leading enterprise will have at least one core competency – that is, a function it performs better than its competition. By building an exceptional management information system into the enterprise, it is possible to push out ahead of the competition. MIS systems provide the tools necessary to gain a better understanding of the market as well as a better understanding of the enterprise itself.

Enhance Supply Chain Management

Improved reporting of business processes leads inevitably to a more streamlined production process. With better information on the production process comes the ability to improve the management of the supply chain, including everything from the sourcing of materials to the manufacturing and distribution of the finished product.

LMIS

Information systems leadership means continuing information systems innovation that meets customers' needs. This implies not only creativity in developing new information systems and enhancing existing ones, but also astute market knowledge to ensure that they sell. The strategy involves delivering a continuous stream of new information systems and/or services, where what is new is valued by the customers. Rapid gains in market acceptance and market share can be due not only to an innovative information system itself, but also to new systems for controlling its manufacturing and distribution, treating it more like a fast-moving consumer good than a traditional product. 3M has traditionally followed a product leadership strategy in the adhesives and coatings market, and the story of Post-it notepads is now legendary – how a 'failed' new adhesive became the basis for a best-selling product. What would we do without it?
The management of 'demand' and 'supply' and achieving a balance between the two is complex. The previous section illustrated that the debate is generally portrayed as alternating between centralization and decentralization. However, the 'middle ground' has become an appealing alternative. Von Simson, for example, subscribes to an IS functional design with IS/IT roles played by both a central IS function and the business units, and prescribes a 'centrally decentralized' IS function with strong dotted-line reporting relationships. He argues that clear structures and distinct roles and responsibilities must be defined, with a mix of centralized and decentralized resources; otherwise, confusion, conflict, duplication of effort and/or inadequate systems integrity will occur. In a similar vein, the federal structure is often seen as capturing the benefits of both centralization and decentralization. With such a structure, business units receive a responsive service from decentralized IS functions, while at the same time a corporate IS function provides groupwide IT services and exerts some degree of central leadership and control over IT activities. While intellectually appealing, little guidance can be found as to what the relevant decision areas are and how to make such a structure work. The key questions are which aspects of IS/IT are best managed centrally and which are best devolved – the degree of diffusion, in Sullivan's terms – and whether IS/IT activities should be managed by a specialist IS function at all or by business managers themselves.
IT leadership includes IT envisioning, fusing IT strategy with business strategy, and managing IS resources. The leadership exhibited by the information officer is a key aspect of achieving success with IS. Two components of leadership of critical importance for the information officer are:
1. Ability to create a set of value expectations shared across all areas of the business—one sensitive to the realities of competency, competition and culture.
2. Ability to deliver on those expectations measurably. Information officers must understand and express IT's value in a way that is meaningful to all executives.
Success in improving the contribution of IS/IT is initially premised on having strong IS leadership within the IS function, and on the IS Director/CIO having credibility within the business. The data from this research highlighted the importance of first getting the basics right – network uptime, availability and reliability of applications, help-desk response times, and so on. Leadership by example appears to be key in achieving a truly open knowledge environment. As an emerging topic of study within the field of IS, we have much to learn about how knowledge can be effectively 'managed' before we can understand how best to deploy IT to improve the processes involved.
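
On 'getting the basics right', an uptime target translates directly into a downtime budget, a calculation worth keeping at hand. A small sketch; the target percentages are just common examples, not figures from the text:

# Converting an availability target into allowed downtime per year.
HOURS_PER_YEAR = 24 * 365

for availability in (0.99, 0.999, 0.9999):
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{availability:.2%} uptime allows {downtime_hours:6.2f} hours "
          f"(~{downtime_hours * 60:7.1f} minutes) of downtime per year")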

Wednesday, July 15, 2009

My Learnings/Reflections in COMDDAP 2009

I attended the Open Source for Business Applications seminar last July 3, 2009, which was presented by Mr. Ben Alegre and Paul de Paula of Spinweb Productions, Inc. I learned that open source is an approach to the design, development, and distribution of software that offers practical accessibility to a software's source code. Open source software is defined as computer software for which the source code and certain other rights normally reserved for copyright holders are provided under a software license that meets the Open Source Definition, or that is in the public domain. This permits users to use, change, and improve the software, and to redistribute it in modified or unmodified forms. It is very often developed in a public, collaborative manner. Open source software is the most prominent example of open source development and is often compared to user-generated content. The term 'open source software' originated as part of a marketing campaign for free software. A report by the Standish Group states that the adoption of open source software models has resulted in savings of about $60 billion per year to consumers.
I discovered that the Open Source Definition is used by the Open Source Initiative to determine whether or not a software license can be considered open source. Mr. Ben Alegre presented the following criteria with which the distribution terms of open-source software must comply:

1. Free Redistribution
The license shall not restrict any party from selling or giving away the software as a component of an aggregate software distribution containing programs from several different sources. The license shall not require a royalty or other fee for such sale.

2. Source Code
The program must include source code, and must allow distribution in source code as well as compiled form. Where some form of a product is not distributed with source code, there must be a well-publicized means of obtaining the source code for no more than a reasonable reproduction cost, preferably downloading via the Internet without charge. The source code must be the preferred form in which a programmer would modify the program. Deliberately obfuscated source code is not allowed. Intermediate forms such as the output of a preprocessor or translator are not allowed.

3. Derived Works
The license must allow modifications and derived works, and must allow them to be distributed under the same terms as the license of the original software.

4. Integrity of The Author's Source Code
The license may restrict source-code from being distributed in modified form only if the license allows the distribution of "patch files" with the source code for the purpose of modifying the program at build time. The license must explicitly permit distribution of software built from modified source code. The license may require derived works to carry a different name or version number from the original software.

5. No Discrimination Against Persons or Groups
The license must not discriminate against any person or group of persons.

6. No Discrimination Against Fields of Endeavor
The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

7. Distribution of License
The rights attached to the program must apply to all to whom the program is redistributed without the need for execution of an additional license by those parties.

8. License Must Not Be Specific to a Product
The rights attached to the program must not depend on the program's being part of a particular software distribution. If the program is extracted from that distribution and used or distributed within the terms of the program's license, all parties to whom the program is redistributed should have the same rights as those that are granted in conjunction with the original software distribution.

9. License Must Not Restrict Other Software
The license must not place restrictions on other software that is distributed along with the licensed software. For example, the license must not insist that all other programs distributed on the same medium must be open-source software.

10. License Must Be Technology-Neutral
No provision of the license may be predicated on any individual technology or style of interface.

In the seminar, Mr. Ben Alegre showed many open source technologies, including Ubuntu, OpenOffice.org, Mozilla Firefox, Thunderbird, Pidgin, GIMPshop, Transmission, LimeWire, VLC, LAMP, Drupal, Joomla, WordPress, osCommerce, Zen Cart, Ruby on Rails, CakePHP, CiviCRM, and Moodle. I learned that with open source there is no vendor lock-in, there are large support communities, and it can be more secure. Like proprietary software, open source software is reliable, offers proven performance, is highly scalable, and is widely used. There are also many kinds of web application where open source software (OSS) can be applied: portals, corporate sites, publishing, government, education, art, music, multimedia, social networking sites, e-commerce, and CRM.

I also attended the Hewlett-Packard Thin Client Server Computing seminar on the same day, which was presented by Nexus Technologies, Inc. I learned that a thin client is a client computer, or client software, in a client-server architecture network that depends primarily on the central server for processing activities and mainly focuses on conveying input and output between the user and the remote server. In contrast, a thick or fat client does as much processing as possible and passes only data for communications and storage to the server.
I discovered that many thin client devices run only web browsers or remote desktop software, meaning that all significant processing occurs on the server. However, recent devices marketed as thin clients can run complete operating systems such as Debian Linux, qualifying them as diskless nodes or hybrid clients. Some thin clients are also called "access terminals." I realized that many people who already have computers want the same functionality that a thin client has. The presenter said that computers can simulate a thin client in a single window (as through a browser) or with a separate operating system boot-up. Either way, these are often called "fat clients" to differentiate them from thin clients and computers without thin-client functionality.
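
To illustrate the division of labour, here is a toy sketch in Python (standard library only) in which a "thin client" sends raw input to a server and only the rendered result travels back. The one-line "protocol" is invented purely for illustration:

# Toy thin-client exchange: the client forwards input, the server does
# all the processing, and only the result crosses the "network".
import socket

# A connected pair of sockets stands in for the network link.
server_sock, client_sock = socket.socketpair()

# "Thin client": forwards the user's keystrokes; renders nothing itself.
client_sock.sendall(b"2+3*4\n")

# "Server": receives the input, does all the processing, returns the result.
expression = server_sock.recv(1024).decode().strip()
result = eval(expression, {"__builtins__": {}})  # demo only; never eval untrusted input
server_sock.sendall(f"{expression} = {result}\n".encode())

# The client merely displays the "screen update" it receives.
print(client_sock.recv(1024).decode().strip())
client_sock.close()
server_sock.close()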

I learned about the several advantages of thin clients, which are:
· Lower IT administration costs. Thin clients are managed almost entirely at the server. The hardware has fewer points of failure and the client is simpler (and often lacks permanent storage), providing protection from malware.

· Easier to secure. Thin clients can be designed so that no application data ever resides on the client (just whatever is displayed), centralizing malware protection and reducing the risks of physical data theft.

· Enhanced data security. Should a thin-client device suffer a serious mishap or industrial accident, no data will be lost, as it resides on the terminal server and not on the point-of-operation device.

· Lower hardware costs. Thin client hardware is generally cheaper because it does not contain a disk, application memory, or a powerful processor. Thin clients also generally have a longer period before requiring an upgrade or becoming obsolete. There are fewer moving parts, and one upgrades the server and network instead, because the limitation on performance is the display resolution, which has a very long life cycle. Many thick clients are replaced after three years to avoid hardware failures in service and to use the latest software, while thin clients can do the same, well-defined task of displaying images for ten years. The total hardware requirements for a thin client system (including both servers and clients) are usually much lower than for a system with fat clients. One reason for this is that the hardware is better utilized. A CPU in a fat workstation is idle most of the time; with thin clients, CPU cycles are shared. If several users are running the same application, it only needs to be loaded into RAM once on a central server (if the application is written to support this capability), whereas with fat clients each workstation must have its own copy of the program in memory.

· Less energy consumption. Dedicated thin client hardware has much lower energy consumption than typical thick client PCs. This not only reduces energy costs but may mean that in some cases air-conditioning systems are not required or need not be upgraded which can be a significant cost saving and contribute to achieving energy saving targets. However, more powerful servers and communications are required.

· Easier hardware failure management. If a thin client fails, a replacement can simply be swapped in while the client is repaired; the user is not inconvenienced because their data is not on the client.

· Worth less to most thieves. Thin client hardware, whether dedicated or simply older hardware that has been repurposed via cascading, is less useful outside a client-server environment. Burglars interested in computer equipment may have a much harder time fencing thin client hardware.

· Operable in hostile environments. Most thin clients have no moving parts, so they can be used in dusty environments without the worry of PC fans clogging up, overheating, and burning out the PC.

· Less network bandwidth. Since terminal servers typically reside on the same high-speed network backbone as file servers, most network traffic is confined to the server room. In a fat client environment, if you open a 10MB document, that's 10MB transferred from the file server to your PC. When you save it, that's another 10MB from your PC to the server. When you print it, the same happens again: another 10MB over the network to your print server and then 10MB onward to the printer. This is highly inefficient. In a thin client environment, only mouse movements, keystrokes, and screen updates are transmitted from/to the end user. Over efficient protocols such as ICA or NX, this can consume as little as 5 kbit/s of bandwidth. This statement makes some very heavy assumptions about the operating environment, though.

· More efficient use of computing resources. A typical thick client will be specified to cope with the maximum load the user needs, which can be inefficient at times when it is not used. In contrast, thin clients only use the exact amount of computing resources required by the current task – in a large network, there is a high probability that the load from each user will fluctuate in a different cycle from that of another user (i.e., the peaks of one will more than likely correspond, time-wise, to the troughs of another). This is a natural result of the additive effect of many random, independent loads: the total load will be normally distributed about a mean, not the sum of the maximum possible loads. (A small simulation sketch follows after this list.)
· Simple hardware upgrade path. If the peak resource usage is above a pre-defined limit, it is a relatively simple process to add another component to a server rack (be it power, processing, storage), boosting resources to exactly the amount required. The existing units can continue to serve alongside the new, whereas a thick client model requires an entire desktop unit be replaced, resulting in down-time for the user, and the problem of disposing of the old unit.

· Lower noise. The aforementioned removal of fans reduces the noise produced by the unit. This can create a more pleasant and productive working environment.

· Less wasted hardware. Computer hardware contains heavy metals and plastics and requires energy and resources to create. Thin clients can remain in service longer, and ultimately produce less surplus computer hardware than an equivalent thick client installation, because they can be made with no moving parts. Computer fans and disk storage (used for cooling and storage in thick clients) have mean times before failure of many thousands of hours, but the transistors and conductors in a thin client have mean times before failure of millions of hours. A thick client is considered old after one or two cycles of Moore's Law, to keep up with increasing software bloat, whereas a thin client is asked to do the same simple job year after year. A thin client will be replaced only when it lacks some feature deemed essential. With audio, video, and USB, thin clients have changed little in 15 years, being essentially stripped-down PCs.
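
The resource-pooling claim above (peaks of one user offsetting troughs of another) can be checked with a quick simulation. This is a sketch with invented load figures, not a measurement:

# Simulating load pooling: the peak of the summed load stays far below
# the sum of the individual peaks.
import random

random.seed(42)      # reproducible illustration
USERS = 100          # concurrent users sharing one server
TICKS = 1000         # time samples
PEAK_PER_USER = 100  # each user's maximum possible demand (arbitrary units)

total_load = []
for _ in range(TICKS):
    # Each user's demand fluctuates independently between 0 and its peak.
    total_load.append(sum(random.uniform(0, PEAK_PER_USER) for _ in range(USERS)))

print("sum of individual peaks:", USERS * PEAK_PER_USER)
print(f"observed peak of the shared load: {max(total_load):.0f}")
print(f"average shared load: {sum(total_load) / TICKS:.0f}")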

References:
WIKIPEDIA.org
http://www.nexus.com.ph
http://www.spinweb.ph

Tuesday, July 14, 2009

Ways of Green Campus Computing

Green computing techniques are easy to incorporate, and they will result in:
• A reduction in overall operating costs by reducing power use, using shared hardware resources, reusing similar systems, and reducing supplies such as toner, ink, and paper.
• Enhanced work environments, such as campus computer lab space and office work space with reduced noise pollution and eye strain from traditional CRTs.
• Corporate and social responsibility through a focus on the Triple Bottom Line, an expanded set of success values focusing on people, planet, and profit.
• An enhanced university image: green computing solutions on the U campus can be used as marketing tools for potential students and researchers.

Putting The University Laboratory’s Computers To Sleep

When you're not using your computer, you can save energy by putting it to "sleep." When your computer is asleep, it's turned on but in a low-power mode. It takes less time for a computer to wake from sleep than it does for the computer to start up after being turned off.

You can put your computer to sleep right away by choosing Apple menu > Sleep. You can also choose to put the computer to sleep automatically when your computer has been inactive for a specified amount of time. You can also set only the display to sleep. If your computer is in the middle of a task that you want to let finish while you are away (for example, burning a DVD), you should set only the display to sleep.

On Windows, open Power Options in Control Panel:
1. Under Power Schemes, click the down arrow, and then select a power scheme. The time settings for the power scheme are displayed in System standby, Turn off monitor, and Turn off hard disks.
2. To turn off your monitor before your computer goes on standby, select a time in Turn off monitor.
3. To turn off your hard disk before your computer goes on standby, select a time in Turn off hard disks.

The University Should Go Green to Save Money

There are several ideas for this: relocate a college's server computers next to a solar-power generator; replace AC power with DC power; cool the servers only where they get the hottest; put the servers in the ocean and power them with waves; and supply computers directly with local DC power. Computers generally use direct current, but the public electricity grid typically supplies alternating current, and as much as 30 percent of the electricity can be lost in converting one form to the other. Intelligent measuring systems like Greenlight should also be extended to allow engineers to determine more precisely how to use energy.

The University Offices' Computer-Generated Waste Should Be Properly Disposed Of

Important steps toward green computing include modifying paper and toner use, disposal of old computer equipment and purchasing decisions when considering new computer equipment.

Paper Waste
• Print as little as possible. Review and modify documents on the screen and use print preview. Minimize the number of hard copies and paper drafts you make. Instead of printing, save information to disks or USB memory sticks.
• Recycle waste paper; have a recycle bin at each community printer and copier location.
• Buy and use recycled paper in your printers and copiers. From an environmental point of view, the best recycled paper is 100 percent post-consumer recycled content.
• Save e-mail whenever possible and avoid needless printing of e-mail messages.
• Use e-mail instead of faxes or send faxes directly from your computer to eliminate the need for a hard copy. When you must fax using hard copies, save paper using a "sticky" fax address note and not a cover sheet.
• On larger documents, use smaller font sizes (consistent with readability) to save paper.
• If your printer prints a test page whenever it is turned on, disable this unnecessary feature.
• Before recycling paper that has print on only one side, set it aside for use as scrap paper or for printing drafts.
• When documents are printed or copied, use double-sided printing and copying. If possible, use the multiple pages per sheet option on printer properties.
• When general information-type documents must be shared within an office, try circulating them instead of making an individual copy for each person. Even better, make the document electronically available to the audience and display it on a projector.

Electronic Waste
• Use the campus network where possible to transfer files. This avoids the need to write CDs or DVDs or use floppy diskettes.
• Use USB memory sticks instead of CDs, DVDs, or floppies.
• Use re-writable CDs and DVDs.
• There are hopes of the University Recycling program addressing e-waste in the near future.

References:
www.utah.edu
http://greencampus.winserve.org
www.dailyutahchronicle.com


Saturday, July 11, 2009

My thoughts on automated elections with reference to the current situation


Past Philippine elections have relied heavily on manual tallying and canvassing of votes. This kind of election makes the process vulnerable to control and manipulation by traditional politicians and those with vested interests. As I searched www.inquirer.net, I saw the basic problems afflicting the electoral system. These are: (a) an outdated electoral process; (b) failure to implement the electoral modernization law; (c) the limited administrative and regulatory capabilities of the COMELEC; (d) ineffective educational/information campaigns on new laws and policies; (e) a weak political party system; (f) unaccountable political financing; and (g) a defective party-list system (Governance Assessment, 2003). Filipinos are sick and tired of this manual process of election, and I do not want to hear of another "Hello Garci" scandal in the coming election.

I hope an automated Philippine election will happen in 2010 to ensure a credible and transparent electoral process. The modernization of the electoral system through computerization should be supported to ensure the credibility of the polls and to correct the deficiencies in the electoral system. The recent settlement of differences between Smartmatic and TIM is for the betterment of the country. Filipinos have desired automated elections for the past four decades. I hope the reconciliation is genuine, so that we have an election system in 2010 that has integrity, security, and veracity, and I hope this government project is not another fraudulent deal like the Mega Pacific contract. Related to this, the two companies agreed to consult a neutral arbiter in Singapore should disagreements arise, or to submit to a Singaporean court that would apply commercial arbitration rules in case of a future disagreement. This development is good for the country and helps solidify the full automation of the 2010 election.

According to recent news this week, for the 2010 election almost 80,000 information technology (IT) people will be hired to assist the Comelec in conducting the automated election and to handle the voting machines. This is a big financial help to many techie guys over a short period of time, and it will be a great experience for them to serve the country and their countrymen in an information technology way.

Truly, our elections need more than automation. The Election Code must be further revised and amended to respond to the needs of the present electoral system. Measures to strengthen the party system and regulate the activities of political parties should be created. State financing of political parties should also be considered through the passage of the Campaign Finance Bill. The Comelec's capacity to raise the level of political discourse and educate citizens regarding their right to vote should be enhanced. This can be done through continuing citizen and voter education in partnership with civil society groups and other government institutions. The electorate must be empowered with information that would help them vote intelligently. The challenge is to develop the people's appreciation of their vote as a means to reform the government and receive better services from it. Part of this challenge is the need to raise the awareness of the electorate on relevant issues and the corresponding platforms of the candidates, if the country is to shift from the politics of personality to the politics of party programs.

Reference:
INQUIRER.net

Friday, July 10, 2009

The Risks Associated with Business and IS/IT Change in Dole Davao

Together with my Group 1 members, namely Anthony Rigor Aguilar, Athina Alorro, Jerusalem Alvaira, and Michael George Guanzon, I had an interview with the MIS programmer of Dole Davao, Cristine Galindo. The interview was conducted in the Dole satellite office located in the SJRDC Building at around 12 noon last July 2. The interview was recorded with a cell phone recorder and a PSP recorder.

Company Profile: Dole Philippines

Dole Food Company's worldwide team of growers, packers, processors, shippers and employees is committed to consistently providing safe, high-quality fresh fruit, vegetables, and food products, while protecting the environment in which its products are grown and processed. Dole's dedication to quality is a commitment solidly backed by: comprehensive programs for food safety, scientific crop protection programs, stringent quality control measures, state-of-the-art production and transportation technologies, continuous improvement through research and innovation, and dedication to the safety of our employees, communities and the environment. Dole is committed to nutrition education to communicate to the public the health benefits of eating a diet rich in fruits and vegetables. Dole is a founding member of the National 5 A Day for Better Health Program and is a leader in developing technology-based nutrition education programs for children.


Based on my visit to Dole Philippines (Davao) and my interview with its MIS programmer, the risks associated with business and IS/IT change are:

1. Failure of the new system. This means that the new system does not achieve the full functionality of the old system, or does not live up to the expectations of end-users. In this case, backup, recovery, and business continuity must be addressed by the system analyst. Disaster recovery planning is often not sufficiently addressed, or is low on the priority list, because there is no immediate, detrimental impact to the entity until a disaster or other situation preventing normal operations arises. The business entity should develop a Disaster Recovery Plan (DRP) that will cope with the unavailability of the computer application(s) during an unexpected outage. This plan should be written, approved by management, and tested on a regular basis. The plan should address how the entity would recover from short- or long-term outages, as well as how operations would continue during the recovery effort. A minimal backup sketch follows below.
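
On backup and verification, the basic principle can be shown in a very small sketch (Python standard library only). The paths and the checksum-verification step here are assumptions of this illustration, not Dole's actual procedure:

# A minimal backup-with-verification sketch. A real Disaster Recovery Plan
# would also cover off-site storage, retention, and regular restore tests.
import hashlib
import shutil
from pathlib import Path

def backup_file(source: Path, backup_dir: Path) -> Path:
    backup_dir.mkdir(parents=True, exist_ok=True)
    destination = backup_dir / source.name
    shutil.copy2(source, destination)  # copy data and timestamps
    # Verify the copy by comparing checksums of source and destination.
    src_digest = hashlib.sha256(source.read_bytes()).hexdigest()
    dst_digest = hashlib.sha256(destination.read_bytes()).hexdigest()
    if src_digest != dst_digest:
        raise RuntimeError(f"backup of {source} failed verification")
    return destination

# Example (hypothetical paths):
# backup_file(Path("data/orders.db"), Path("/mnt/backup/daily"))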

2. End-users will not accept the new system. The employees who will use the new system may not like its interface or graphical user interface (GUI) because they are familiar with the old system's interface. This problem can be solved through proper training. The success of any application is greatly dependent upon the training provided to the end-users, both initially and on a continuing basis. Continuing education is necessary to ensure employees are aware of, and proficient with, application enhancements and new releases. Recurring education also addresses the training needs of new employees. An adequate training curriculum must be available to application users.

3. Synchronization of all applications of the new system. Synchronization of the system means that the functions and applications of the system occur at the same time or proceed at the same rate; it also means a simultaneous flow of functions in the new system. One factor here is program change control. The purpose of program change control is to ensure that only appropriate changes to program logic are made, that they are performed in a timely manner, that they do not negatively impact other logic, and that they ultimately produce the results expected by the user who requested the change. Another factor is system interfaces. The endless pursuit of efficiency gains has resulted in the ability to transfer data from one system to another electronically, rather than expending time keying data into both systems. The exchange of data from one business application to another is considered an interface. The accuracy and completeness of data files transmitted to, or received from, other applications should be assured by a quality control process consistent with the receiving application's edit standards.

4. Data must not change in the new system. In this instance, the term data integrity arises. The purpose of data integrity is to ensure complete and accurate data, which can be reported in any manner users require, with all fields formatted according to data definition rules and within established ranges (date fields should not allow month > 12 or day > 31, and a code field should only be populated with valid values). The risk of internal fraud increases if individuals are granted the ability to modify program logic as well as production data. Adequate data input edits must be in place to prevent data corruption (online or batch); a minimal sketch follows below.
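
The date and code-field edits mentioned above are straightforward to express in code. Here is a minimal sketch; the field names and the list of valid codes are hypothetical:

# Data-input edits: reject dates with month > 12 or day > 31, and code
# fields outside the set of valid values.
VALID_STATUS_CODES = {"A", "I", "P"}  # e.g. Active, Inactive, Pending (hypothetical)

def validate_record(month: int, day: int, status_code: str) -> list:
    errors = []
    if not 1 <= month <= 12:
        errors.append(f"month {month} out of range 1-12")
    if not 1 <= day <= 31:
        errors.append(f"day {day} out of range 1-31")
    if status_code not in VALID_STATUS_CODES:
        errors.append(f"status code {status_code!r} not in {sorted(VALID_STATUS_CODES)}")
    return errors

print(validate_record(7, 2, "A"))    # [] -> record accepted
print(validate_record(13, 40, "X"))  # three edit failures reported
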

5. Upgrading from old system to the new System. Upgrading means to advance or raise to higher a new system. One feature of upgrading is Data Access Security. The purpose of data access security within any application is to grant users appropriate access privileges necessary for the job they perform while restricting privileges not needed for their job or that could create weaknesses in the Internal Control structure of the entity. Duties and responsibilities assigned to each job role should be defined by management that ensure adequate segregation of duties. Those job role definitions can then be used to establish specific application permissions granted and/or restricted. Data access security should also provide an audit trail, which could be utilized to identify specific users that made individual changes to the data. Network environments which allow users to access the data directly (typically via a database utility such as Paradox or MS Access), effectively voids data access security within the application and should not be allowed. This type of access allows users full update capability with no audit trail. Another feature is Network Security. This higher level of security would typically grant users the ability to access an application, then administration of the specific application security would be utilized to grant and/or restrict data access as necessary within the application. Networks become more complex as more efficient, effective and secure products are made available through advances in Information Systems technology. Preventive measures can reduce the risk associated with threats inherently caused by advances in technology.