Categories
Computer

How Has Computer Hacking Interfered Modern Society Essay

How has computing machine choping inferred modern society? In this transition I will be speaking briefly about the basicss of computing machine choping from the yesteryear to the present. Computer hacking has changed more over clip ensuing in computing machine outgrowths to corporate system closures. This research paper will be speaking about three major parts of computing machine hacking. The first construct of hacking is the beginning of creative activity. The following portion will be how hacking has affect on the contemporary society.
Finally. the last piece of information will be traveling over the hereafter of system choping. What is a drudge you may inquire your ego but non hold an reply or one word phrase for the term? A drudge has ever been a sort of cutoff or alteration. a manner to short-circuit or make over the standard operation of an object or system. The first computing machine hackers emerge at MIT. They borrow their name from a term to depict members of a theoretical account train group at the school who hack the electric trains. paths. and switches to do them execute faster and otherwise.
A few of the members transfer their wonder and set uping accomplishments to the new mainframe calculating systems being studied and developed on campus. Choping groups begin to organize. Among the first are Legion of Doom in the United States. and Chaos Computer Club in Germany. The film “War Games” introduces the populace to choping. A computing machine hacker intends to destroy concerns executing an act much more unprincipled than an enthusiastic life scientist ‘hacking’ off at work or theory. The truth is that computing machine hacking is in fact easy in the general sense. but more consideration must be given.

Some facets of choping are used in mundane life and you may non cognize that accessing wireless cyberspace from another person’s history is considered wireless choping even though your adoption there connexion. During the 1970’s. a different sort of hacker appeared: the phreaks or phone hackers. They learned ways to chop the telephonic system and do phone calls for free. Within these group of people. a phreaker became celebrated because a simple find. John Draper. besides known as Captain Crunch. found that he could do long distance calls with a whistling.
He built a blue box that could make this and the Esquire magazine published an article on how to construct them. Fascinated by this find. two childs. Steve Wozniak and Steve Jobs. decided to sell these bluish boxes. get downing a concern friendly relationship which resulted in the initiation of Apple. By the 1980’s. phreaks started to migrate to computing machines. and the first Bulletin Board Systems ( BBS ) appeared. BBS are like the yokel groups of today. were people posted messages on any sort of subject.
The BBS used by hackers specialized in tips on how to interrupt into computing machines. how to utilize stolen recognition card Numberss and portion stolen computing machine watchwords. It wasn’t until 1986 that the U. S. authorities realized the danger that hackers represented to the national security. As a manner to antagonize this threat. the Congress passed the Computer Fraud and Abuse Act. doing computing machine interrupting a offense across the state. During the 1990’s. when the usage of the cyberspace became widespread around the universe. hackers multiplied. but it wasn’t until the terminal of the decennary that system’s security became mainstream among the populace.
Today. we are accustomed to hackers. crackers. viruses. Trojans. worms and all of the techniques we need to follow to battle them. Hackers were classified into three unusual types the first class is called In-house hacker. In-house hacker is an employee who is responsible for operating and keeping the system. who interacts instantly with the system as a coder or informations entry employee and is cognizant of all the system security capablenesss and spreads. He and should be the guard of the system but for different motives he hacks the system and gets what he needs bewraying all the trust given to him.
The 2nd type is called ace hacker who doesn’t interact with the system but at the same clip proctors the system on day-to-day footing and has an oculus on what is traveling on and what type of informations is entered at what clip so depending on the entries he decides the minute he should acquire these information and recover them for personal motives while the 3rd type is called professional hacker and this hacker is really strong and capable of acquiring any type of informations. he has the ability of carrying the user or the operator to supply him with the needed information by programming fast ones or user friendly screens and this sort of hackers frequently gets alone preparation specially when being used in military undertakings as what happened in the cold war.
Thesiss are merely brief ways hackers have impacted the modern universe we all live in. Within the past twelvemonth at that place have been two major instances right in the country. Both involve extended harm. and both are presently in tribunal. The closest instance is that of Thomas Crandall. otherwise known as St. Elmo’s Fire. Crandall is accused of estroying attending and subject records in a computing machine at Central Technical and Vocational Center. Police charge that Crandall used a personal computing machine at his place to entree the computing machine. He is besides accused of making $ 25. 000 in harm to files at Waste Management Inc. of Oakbrook. Ill. Crandall’s lawyer claims that many other pupils besides had entree to the computing machine. and that to individual out Crandall in unjust. Hackers are responsible of the immense development in computing machine and cyberspace engineering. but these yearss we consider them as stealers and interlopers who penetrated our ain privateness and used the accomplishments they were buttockss for their ain benefit.
Hackers have different sentiments and motives. However. they all portion the spirit of challenge and ever seeking to turn out their capablenesss of making what all believe is impossible possibly because they were mistreated. or uncertainties surrounded their abilities and past accomplishments. Hackers believe that information should be shared and they fight against information owning. Effectss that choping caused and still doing to the society can’t be ignored. Hacking nowadays is taking new stages and the danger is increasing because we are now populating in a society that runs by ICT. and any onslaught to the ICT particularly in advanced states will do critical effects.
ICT still lacks a powerful security tools that are capable of tracking. catching hackers. and protecting computing machine systems from their onslaughts. My ain position is that the best manner to protect ICT from hackers is to analyze their psychological science and seek to understand their manner of thought. because hackers are human existences. who have two sides ; evil and good. and they used to demo their good side but all of a sudden they turned to be evil. The grounds which caused the transmutation from good to evil persons should be studied and given the highest precedence in the war against hackers because since we put our custodies on the cause. we can repair it to make for better effects.
Peoples can grocery store. earn grades. receive bank statements and pay measures from their laptop or Personal computer. The possibilities are endless when it comes to simplifying life with the aid of the World Wide Web. but at the same clip possibilities are eternal hackers to perplex your life with cyber offenses. The Merriam-Webster Dictionary defines a hacker as both “an expert at programming and work outing jobs with a computer” and “a individual who illicitly additions entree to and sometimes tamping bars with information in a computing machine system. ” Those three grounds I have stated above were the hackings past present and future. Until engineering Michigans turning the possibility of hackers is limited less.

Categories
Computer

Dell Computer Value Chain

Dell Computer Corporation Background: Founded in 1984 by Michael Dell with the aim of building relationships directly with customers. Dell is a premier provider of PC products and services sought by customers worldwide to build their information technology and internet infrastructures. Through its direct business model it designs, manufactures and customises products and services to customer requirements and offers an extensive selection of software and peripherals. Dell’s operations using Porter’s value chain – Inbound logistics:Dell has three main factories globally in Austin, Texas; Limerick, Ireland and Penang, Malaysia. The majority of components have to be warehoused within close proximity so suppliers are located close to the factory plants as this facilitates the ease of dispatch of goods whilst lowering costs. Dell established partnerships with its major suppliers using JIT (Just-In-Time) deliveries.
Where necessary, Dell provides sales forecasts to suppliers for non-JIT deliveries. To minimise inventory costs Dell opted not to take delivery of bulky items such as monitors and speakers.Such items as dispatched directly to the customer from the suppliers warehouses. Operations: Time and cost savings are key issues and Dell utilises the JIT manufacturing process, loading software and testing the PC’s assembled to order. Finished goods are kept to an extremely low-level hence minimising the risks associated with buffer stock. Dell sees its manufacturing process as a way to cut costs and maintain its competitive advantage. Outbound logistics: Dell provides direct delivery by courier of the finished goods to the final customer.
Supplies of sub-assembly components are delivered directly to the customer by the supplier. Marketing and Sales: Dell utilises telesales, media (TV, newspapers and magazines) advertisements in addition to online ads as marketing tools. There are provisions for customer advice on PC specifications and price. This leads to more up to date product specifications due to low quantities of buffer stocks being held in-house. Dell uses its marketing and sales to continually improve and develop its relationship with the end-user/customer. Services:Dell provides installation services by Dell experts as well as 24/7 online support for large businesses and institutions as well as for small businesses and home PC users. Asset recovery and recycling services in an environmentally friendly manner are offered.

PC support services in case of malfunctions and protection services against accidental damage are provided. Procurement: Dell built strong supplier relations through their close proximity to the factories in return for guaranteed orders. In this way suppliers inventory levels rarely pile up thereby keeping their costs down.Through the creation of supplier hubs (supplier-managed distribution points) near Dell plants the company was able to limit the number of suppliers required globally. Technology department: Dell has developed e-services and 24/7 online support via their website www. dell. com.
More recently Dell is investing in network server technology and building partnerships. Human Resources Management: At Dell, HR is divided into Operations and Management. HR Operations coordinates transactional functions such as benefits, compensation and employee relations through a service centre.Staff members report directly up the chain through HR and rarely have contact with the core business units. HR management includes Dell University, the company’s education and training function, staffing and HR generalists who report to both the VP of a business unit and the VP of HR. Management deals with tactical rather than transactional issues. These HR employees attend the business unit’s staff meetings as consultants, develop the leadership team, produce matrices for such thing as turnover, productivity and cycle times and develop HR strategy for that particular line of business.
Firm’s infrastructure: Dell is a global company operating in 34 countries in 3 world regions with about 35,000 employees and $30 billion in sales. Dell is organised along geographic lines into the Americas, Asia-Pacific and Japan, and Europe/Middle East/Africa (EMEA). The corporate headquarters is located in Round Rock, Texas and is also the regional headquarters for Dell Americas. Each of these regions has its own regional headquarters and its own assembly plants and supplier networks. Regional headquarters include Bracknell, UK for EMEA, Hong Kong for Asia-Pacific and Kawasaki for Japan.Dell’s use of innovation and its effect on operations: Dell, one of the world’s leading providers of Technology has been using innovation throughout its business and as a result gained reputation and market share. Three examples of innovation which the company uses are as follows: Affordability of latest technology through Direct Sales Services – Dells uses direct customer relationship or as it calls it “customer intimacy” as its distribution strategy.
This means meeting customer needs directly and cutting middleman interference as much as possible.This is done either through dedicated sales representatives, telephone based sales and online at www. dell. com. As a result, the purchase price would be lower than other competitors. Dell provides different pricing for different budgets. The secret lies in the customer choice in selecting which parts s/he wants to have in the computer.
Does the customer want a simple or a more luxurious computer and how much is he willing to invest in this purchase. Customer Choice and Custom Tailored Services – Customers have various methods of purchasing and can choose whichever channel is appropriate for them.These include telephone, website and kiosk where they can examine, read reviews and check the price. Dell uses a “build-to-order” manufacturing process which will on average enable them to turn over inventory every five days. This will result in reduced inventory levels and bring the latest technology and design to customers at the lowest price. Customers have 24/7 support via the telephone or online whereby “any time, any problem can be consulted for”. There is also a 24 hour shipment service.
A customer not only can tailor to their needs but can have what they want within 24 hours.Relationship with the supplier – Dell has been a successful player in building relationships with its suppliers and uses a Just in Time (JIT) approach to inventory management with suppliers maintaining their own inventory. Once a customer orders parts for a computer, Dell notifies the supplier to deliver the parts to the factory for assembly. It was widely believed that customers would prefer buying the computers through traditional retail distribution methods. Dell tried this traditional approach and this resulted in its first ever big loss of ($36m) in 1993.As a result of this Dell reverted back to a direct business model approach. Dell’s combined market led and product led approaches Dell combines both market led and product led approaches together as a method of satisfying its customers to outperform its competitors.
Dell identifies its customers’ needs i. e. the type of PC’s they like and the features and specifications they are willing to pay for. This is achieved through online surveys, telephone and face to face market research in order to get a general idea of what to produce and then work with a target costing method to achieve its profits but also satisfying customers’ price.Having identified the customers’ needs the next step is to market and sell the final product. Examples of this approach are: Online 24/7 shopping and customer service – To create an excellent customer service above its competitors Dell offers 24hrs online service where customers can fill in questionnaires (customer feedback forms) at their convenience, view available products, custom build their PC’s with step by step specification pricing, quick and secure ordering process , regular special deals and financing options .Free one to one buying advice – With this service customers can either speak to or chat live with an adviser on a one to one basis and ascertain the available features relevant to the specification of their desired PC along with the benefits.
How Dell gained competitive advantage: With the rapid growth in technology during the 90’s Dell experienced a slump in 1993 whereby it was forced to re-evaluate their business model in order to try to recapture their market share and competitive advantage.Management’s objective was to focus on specific aspects of Dell’s business and identify how to bring about efficiency through cost savings and thereby increased profitability. Management recognised that by focusing on three areas profitability could be attained. Virtual integration: – In by-passing the retail distribution chain Dell significantly reduced costs by virtually eliminating inventory at each factory through its supplier relations and directly linking customers to the manufacturer. The aim being to cut costs and expedite delivery time with a more reliable value added finished product. E. .
Dell was able to trim the number of suppliers used from 204 to 47 in their Austin facility between 1995 and 1998. This led to the number of days a PC sat in inventory from 32 to 7 days. Dell’s target customer was the “knowledgeable PC user” who knew what they wanted. Dell focused on using a direct business model to target these customers. Real value customer service features: – Dell identified from past sales history and experience since inception that their customer base could be segmented and further targeted to identify their needs. They identified 2 categories – (1) Relationship buyers i. e.
arge businesses and institutions and (2) Transactional buyers i. e. small businesses and home PC users. Dell recognised that each category had specific needs. For example, the Relationship buyer was their larger customer base and required more assistance and so were assigned a representative to guide them through the buying experience whereas the Transactional buyer was offered online or telephone assistance. By integrating these categories into their customer service system repeat purchases were quick and easy, purchasing history could be consulted and follow up customer service was more effective.Tailoring manufacturing to customer needs: – Through integration Dell was able to link customers directly to the manufacturers.
Customers specific needs were met directly and a more efficient manufacturing process leading to final product completion time added value to the customer by way of a quicker delivery time. Suppliers were also eager to do business with Dell because their inventory levels rarely piled up and reduced their in-house costs. This all added value to Dell’s profitability.Analyse how Intel’s approach to R&D and manufacturing could be applied to Dell Although most research focuses on doing something new which is related to technological advancement, in the case of Dell this is different because it has identified customers with specific needs. Dell targets “knowledgeable PC users” and makes its task of providing a PC easier. Their research focused mainly on an operations management strategy of Just In Time (JIT) process to minimise stockholding costs and delivering the customer order on a timely basis i. e.
Economy.Also, Dell researched customer needs and then developed their website e-services (build a PC to order, delivery of PC, and 24/7 online and telephone customer service) i. e. Efficiency. In this way, Dell is able to meet its customers expectations with quality products within the shortest delivery time i. e. Effectiveness.
Intel’s approach to research focuses mainly on “manufacturing capabilities and materials” and “what technology can offer”. This relates to Dell in that they are using JIT to save costs and shorten lead-time through the management of supplier relationships.Technology has continually enabled Dell to develop their website to offer their customers flexibility.References: 1. Christopher, M ‘Dell Computers: Using the supply Chain to Compete’, Logistics and Supply chain Management (2nd Ed), Financial Times/Pitman Publishing, 1998, p. Unknown. 2.
Serwer, Andrew E, ‘Michael Dell turns the PC world inside out’, Fortune, 8 September 1997, pp. 38-44. 3. Author unknown, ‘Dells Competitive Advantage 81’, [Online]. Accessed 19 October 2010 from Google database http://google. com 4. Dell advert – Metro, 18th October 2010

Categories
Computer

History of Computer Virus

THE HISTORY OF COMPUTER VIRUSES A Bit of Archeology There are lots and lots of opinions on the date of birth of the first computer virus. I know for sure just that there were no viruses on the Babbidge machine, but the Univac 1108 and IBM 360/370 already had them (“Pervading Animal” and “Christmas tree”). Therefore the first virus was born in the very beginning of 1970s or even in the end of 1960s, although nobody was calling it a virus then. And with that consider the topic of the extinct fossil species closed. Journey’s Start Let’s talk of the latest history: “Brain”, “Vienna”, “Cascade”, etc.
Those who started using IBM PCs as far as in mid-80s might still remember the total epidemic of these viruses in 1987-1989. Letters were dropping from displays, crowds of users rushing towards monitor service people (unlike of these days, when hard disk drives die from old age but yet some unknown modern viruses are to blame). Their computers started playing a hymn called “Yankee Doodle”, but by then people were already clever, and nobody tried to fix their speakers – very soon it became clear that this problem wasn’t with the hardware, it was a virus, and not even a single one, more like a dozen.
And so viruses started infecting files. The “Brain” virus and bouncing ball of the “Ping-pong” virus marked the victory of viruses over the boot sector. IBM PC users of course didn’t like all that at all. And so there appeared antidotes. Which was the first? I don’t know, there were many of them. Only few of them are still alive, and all of these anti-viruses did grow from single project up to the major software companies playing big roles on the software market. There is also an notable difference in conquering different countries by viruses.

The first vastly spread virus in the West was a bootable one called “Brain”, the “Vienna” and “Cascade” file viruses appeared later. Unlike that in East Europe and Russia file viruses came first followed by bootable ones a year later. Time went on, viruses multiplied. They all were all alike in a sense, tried to get to RAM, stuck to files and sectors, periodically killing files, diskettes and hard disks. One of the first “revelations” was the “Frodo. 4096” virus, which is far as I know was the first invisible virus (Stealth).
This virus intercepted INT 21h, and during DOS calls to the infected files it changed the information so that the file appeared to the user uninfected. But this was just an overhead over MS-DOS. In less than a year electronic bugs attacked the DOS kernel (“Beast. 512” Stealth virus). The idea of in visibility continued to bear its fruits: in summer of 1991 there was a plague of “Dir_II”. “Yeah! “, said everyone who dug into it. But it was pretty easy to fight the Stealth ones: once you clean RAM, you may stop worrying and just search for the beast and cure it to your hearts content.
Other, self encrypting viruses, sometimes appearing in software collections, were more troublesome. This is because to identify and delete them it was necessary to write special subroutines, debug them. But then nobody paid attention to it, until … Until the new generation of viruses came, those called polymorphic viruses. These viruses use another approach to invisibility: they encrypt themselves (in most cases), and to decrypt themselves later they use commands which may and may not be repeated in different infected files.
Polymorphism – Viral Mutation The first polymorphic virus called “Chameleon” became known in the early ’90s, but the problem with polymorphic viruses became really serious only a year after that, in April 1991, with the worldwide epidemic of the polymorphic virus “Tequila” (as far as I know Russia was untouched by the epidemic; the first epidemic in Russia, caused by a polymorphic virus, happened as late as in 1994, in three years, the virus was called “Phantom1”).
The idea of self encrypting polymorphic viruses gained popularity and brought to life generators of polymorphic code – in early 1992 the famous “Dedicated” virus appears, based on the first known polymorphic generator MtE and the first in a series of MtE-viruses; shortly after that there appears the polymorphic generator itself. It is essentially an object module (OBJ file), and now to get a polymorphic mutant virus from a conventional non-encrypting virus it is sufficient to simply link their object modules together – the polymorphic OBJ file and the virus OBJ file.
Now to create a real polymorphic virus one doesn’t have to dwell on the code of his own encryptor/decryptor. He may now connect the polymorphic generator to his virus and call it from the code of the virus when desired. Luckily the first MtE-virus wasn’t spread and did not cause epidemics. In their turn the anti-virus developers had sometime in store to prepare for the new attack. In just a year production of polymorphic viruses becomes a “trade”, followed by their “avalanche” in 1993. Among the viruses coming to my collection the volume of polymorphic viruses increases.
It seems that one of the main directions in this uneasy job of creating new viruses becomes creation and debugging of polymorphic mechanism, the authors of viruses compete not in creating the toughest virus but the toughest polymorphic mechanism instead. This is a partial list of the viruses that can be called 100 percent polymorphic (late 1993): Bootache, CivilWar (four versions), Crusher, Dudley, Fly, Freddy, Ginger, Grog, Haifa, Moctezuma (two versions), MVF, Necros, Nukehard, PcFly (three versions), Predator, Satanbug, Sandra, Shoker, Todor, Tremor, Trigger, Uruguay (eight versions).
These viruses require special methods of detection, including emulation of the viruses executable code, mathematical algorithms of restoring parts of the code and data in virus etc. Ten more new viruses may be considered non-100 percent polymorphic (that is they do encrypt themselves but in decryption routine there always exist some nonchanging bytes): Basilisk, Daemaen, Invisible (two versions), Mirea (several versions), Rasek (three versions), Sarov, Scoundrel, Seat, Silly, Simulation. However to detect them and to restore the infected objects code decrypting is still required, because the length of nonchanging code in the decryption outine of those viruses is too small. Polymorphic generators are also being developed together with polymorphic viruses. Several new ones appear utilizing more complex methods of generating polymorphic code. They become widely spread over the bulletin board systems as archives containing object modules, documentation and examples of use. By the end of 1993 there are seven known generators of polymorphic code. They are: MTE 0. 90 (Mutation Engine), TPE (Trident Polymorphic Engine), four versions NED (Nuke Encryption Device), DAME (Dark Angel’s Multiple Encryptor)
Since then every year brought several new polymorphic generators, so there is little sense in publishing the entire lists. Automating Production and Viral Construction Sets Laziness is the moving force of progress (to construct the wheel because that’s too lazy to carry mammoths to the cave). This traditional wisdom needs no comments. But only in the middle of 1992 progress in the form of automating production touched the world of viruses. On the fifth of July 1992 the first viral code construction set for IBM PC compatibles called VCL (Virus Creation Laboratory) version 1. 00 is declared for production and shipping.
This set allows to generate well commented source texts of viruses in the form or assembly language texts, object modules and infected files themselves. VCL uses standard windowed interface. With the help of a menu system one can choose virus type, objects to infect (COM or/and EXE), presence or absence of self encryption, measures of protection from debugging, inside text strings, optional 10 additional effects etc. Viruses can use standard method of infecting a file by adding their body to the end of file, or replace files with their body destroying the original content of a file, or become companion viruses.
And then it became much easier to do wrong: if you want somebody to have some computer trouble just run VCL and within 10 to 15 minutes you have 30-40 different viruses you may then run on computers of your enemies. A virus to every computer! The further the better. On the 27th of July the first version of PS-MPC (Phalcon/Skism Mass-Produced Code Generator). This set does not have windowed interface, it uses configuration file to generate viral source code.
This file contains description of the virus: the type of infected files (COM or EXE); resident capabilities (unlike VCL, PS-MPC can also produce resident viruses); method of installing the resident copy of the virus; self encryption capabilities; the ability to infect COMMAND. COM and lots of other useful information. Another construction set G2 (Phalcon/Skism’s G2 0. 70 beta) has been created. It supported PS-MPC configuration files, however allowing much more options when coding the same functions. The version of G2 I have is dated the first of January 1993.
Apparently the authors of G2 spent the New Year’s Eve in front of their computers. They’d better have some champagne instead, this wouldn’t hurt anyway. So in what way did the virus construction sets influence electronic wildlife? In my virus collection there are: • several hundreds of VCL and G2 based viruses; • over a thousand PS-MPC based viruses. So we have another tendency in development of computer viruses: the increasing number of “construction set” viruses; more unconcealably lazy people join the ranks of virus makers, downgrading a respectable and creative profession of creating viruses to a mundane rough trade.
Outside DOS The year 1992 brought more than polymorphic viruses and virus construction sets. The end of the year saw the first virus for Windows, which thus opened a new page in the history of virus making. Being small (less than 1K in size) and absolutely harmless this non resident virus quite proficiently infected executables of new Windows format (NewEXE); a window into the world of Windows was opened with its appearance on the scene. After some time there appeared viruses for OS/2, and January 1996 brought the first Windows95 virus.
Presently not a single week goes by without new viruses infecting non-DOS systems; possibly the problem of non-DOS viruses will soon become more important than the problem of DOS viruses. Most likely the process of changing priorities will resemble the process of DOS dying and new operating systems gaining strength together with their specific programs. As soon as all the existing software for DOS will be replaced by their Windows, Windows95 and OS/2 analogues, the problem of DOS viruses becomes nonexistent and purely theoretical for computer society. The first attempt to create a virus working in 386 protected mode was also made in 1993.
It was a boot virus, "PMBS", named after a text string in its body. After booting from an infected drive, this virus switched to protected mode, made itself the supervisor and then loaded DOS in virtual 8086 (V86) mode. Luckily this virus was born dead – its second generation refused to propagate due to several errors in the code. Besides that, the infected system "hung" if any program tried to reach outside V86 mode, for example to determine the presence of extended memory. This unsuccessful attempt to create a supervisor virus remained the only one up to the spring of 1997, when a Moscow prodigy released "PM.Wanderer" – a quite successful implementation of a protected mode virus. It is unclear now whether such supervisor viruses might present a real problem for users and anti-virus developers in the future. Most likely not, because such viruses must "go to sleep" while new operating systems (Windows 3.xx, Windows95/NT, OS/2) are up and running, allowing for easy detection and killing of the virus. But a full-scale stealth supervisor virus could mean a lot of trouble for "pure" DOS users, because it is absolutely impossible to detect such a stealth virus under pure DOS.
Macro Virus Epidemics
August 1995. All of progressive humanity, Microsoft and Bill Gates personally celebrate the release of the new operating system Windows95. Amid all that noise, the message about a new virus using fundamentally new methods of infection passed virtually unnoticed. The virus infected Microsoft Word documents. Frankly, it wasn't the first virus to infect Word documents: earlier, anti-virus companies had had on their hands a first experimental example of a virus which copied itself from one document to another. However, nobody paid serious attention to that not-quite-successful experiment.
As a result, virtually all the anti-virus companies turned out to be unprepared for what came next – the macro virus epidemic – and started to work out quick but inadequate measures to put an end to it. For example, several companies almost simultaneously released "anti-virus documents", acting along roughly the same lines as the virus itself but destroying it instead of propagating. Incidentally, anti-virus literature had to be corrected in a hurry, because earlier the question "Is it possible to infect a computer by simply reading a file?" had been answered with a definite "No way!", with lengthy proofs. As for the virus, which by that time had got its name, "Concept", it continued its ride of victory over the planet. Having most probably escaped from some division of Microsoft, "Concept" ran over thousands if not millions of computers in no time at all. This is not unusual, because text exchange in Microsoft Word format became in fact one of the industry standards, and to get infected it is sufficient just to open an infected document; all documents subsequently edited by the infected copy of Word become infected too.
As a result, having received an infected file over the Internet and opened it, the unsuspecting user became an "infection peddler", and if his correspondence was conducted with the help of MS Word, it became infected as well. Therefore the possibility of infection via MS Word, multiplied by the speed of the Internet, became one of the most serious problems in the whole history of computer viruses. In less than a year, sometime in the summer of 1996, the "Laroux" virus appeared, infecting Microsoft Excel spreadsheets. As with "Concept", this new virus was discovered almost simultaneously in several companies.
The same year 1996 witnessed the first macro virus construction sets; then the beginning of 1997 brought the first polymorphic macro viruses for MS Word and the first viruses for Microsoft Office97. The number of macro viruses also increased steadily, reaching several hundred by the summer of 1997. Macro viruses, which opened a new page in August 1995, drawing on all the experience in virus making accumulated over almost 10 years of continuous work and enhancement, actually present the biggest problem for modern virology.
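A first line of defense against document-borne macro code is simply flagging files that carry a macro project at all. A minimal sketch for the modern Office Open XML containers (.docm, .xlsm), which did not exist in 1995 but store VBA code in a vbaProject.bin member; the member names are real parts of that format, but this check is only an illustration, not a product-grade scanner:

```python
import io
import zipfile

def has_vba_project(doc_bytes: bytes) -> bool:
    """Flag OOXML documents (.docm, .xlsm, ...) that embed a VBA macro project.

    OOXML files are ZIP containers; VBA code is stored in a member named
    vbaProject.bin (e.g. word/vbaProject.bin or xl/vbaProject.bin).
    """
    try:
        with zipfile.ZipFile(io.BytesIO(doc_bytes)) as archive:
            return any(name.endswith("vbaProject.bin")
                       for name in archive.namelist())
    except zipfile.BadZipFile:
        return False  # not an OOXML container at all
```

Binary Word 6/7 documents of the "Concept" era used the older OLE compound-file format instead, so a real scanner would need a separate parser for those.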
Chronology of Events
It's time to give a more detailed description of events. Let's start from the very beginning.
Late 1960s – early 1970s
Periodically, programs called "rabbits" appeared on the mainframes of that period. These programs cloned themselves and occupied system resources, thus lowering the productivity of the system. Most probably "rabbits" did not copy themselves from system to system and were strictly local phenomena – mistakes or pranks by the system programmers servicing these computers.
The first incident which may well be called an epidemic of "a computer virus" happened on the Univac 1108 system. The virus, called "Pervading Animal", merged itself with the end of executable files – virtually the same thing thousands of modern viruses do.
The first half of the 1970s
The "Creeper" virus, created under the Tenex operating system, used global computer networks to spread itself. The virus was capable of entering a network by itself via modem and transferring a copy of itself to the remote system. The "Reaper" anti-virus program was created to fight this virus; it was the first known anti-virus program.
Early 1980s
Computers become more and more popular. An increasing number of programs appear, written not by software companies but by private persons; moreover, these programs may be freely distributed and exchanged through general-access servers – BBSes. As a result there appears a huge number of miscellaneous "Trojan horses" – programs doing some kind of harm to the system when started.
1981
The "Elk Cloner" bootable virus epidemic started on Apple II computers. The virus attached itself to the boot sector of diskettes that were accessed.
It showed itself in many ways – it flipped the display, made text displays blink and showed various messages.
1986
The first IBM PC virus pandemic, "Brain", began. This virus, infecting 360 KB diskettes, spread over the world almost instantly. The secret of such "success" probably lay in the total unpreparedness of the computer society for such a phenomenon as a computer virus. The virus was created in Pakistan by the brothers Basit and Amjad Farooq Alvi, who left a text message inside the virus with their name, address and telephone number.
According to the authors, they were software vendors and wanted to know the extent of piracy in their country. Unfortunately, their experiment escaped the borders of Pakistan. It is also interesting that "Brain" was the first stealth virus: if there was an attempt to read the infected sector, the virus substituted a clean original one. Also in 1986, a programmer named Ralph Burger found out that a program can create copies of itself by adding its code to DOS executables. His first virus, called "VirDem", was a demonstration of this capability.
This virus was announced in December 1986 at an underground computer forum which consisted of hackers specializing at that time in cracking VAX/VMS systems (the Chaos Computer Club in Hamburg).
1987
The "Vienna" virus appears. Ralph Burger, whom we already know, gets a copy of this virus, disassembles it, and publishes the result in his book "Computer Viruses: A High-Tech Disease". Burger's book made the idea of writing viruses popular, explained how to do it, and thereby stimulated the creation of hundreds and later thousands of computer viruses implementing ideas from his book.
Some more IBM PC viruses were written independently in the same year: "Lehigh", infecting only the COMMAND.COM file; "Suriv-1" a.k.a. "April1st", infecting COM files; "Suriv-2", infecting (for the first time ever) EXE files; and "Suriv-3", infecting both COM and EXE files. Several boot viruses also appeared ("Yale" in the USA, "Stoned" in New Zealand, "PingPong" in Italy), as did the first self-encrypting file virus, "Cascade". Non-IBM computers were not forgotten either: several viruses for the Apple Macintosh, Commodore Amiga and Atari ST were detected.
In December 1987 came the first total epidemic of a network virus, "Christmas Tree", written in the REXX language and spreading under the VM/CMS operating environment. On the ninth of December this virus was introduced into the Bitnet network at one of the West German universities; via a gateway it then got into the European Academic Research Network (EARN) and then into IBM's Vnet. In four days (by Dec. 13) the virus paralyzed the network, which was overflowing with copies of it (see the desk clerk example several pages earlier).
On start-up the virus displayed an image of a Christmas tree and then sent copies of itself to all network users whose addresses were in the corresponding system files, NAMES and NETLOG.
1988
On Friday the 13th, 1988, several companies and universities in many countries of the world "got acquainted" with the "Jerusalem" virus. On that day the virus destroyed files as they were run. This is probably one of the first MS-DOS viruses to cause a real pandemic: there was news about infected computers from Europe, America and the Middle East.
Incidentally, the virus got its name after one of the places it struck – the Jerusalem University. "Jerusalem", together with several other viruses ("Cascade", "Stoned", "Vienna"), infected thousands of computers while still going unnoticed – anti-virus programs were not as common then as they are now, and many users and even professionals did not believe in the existence of computer viruses. It is notable that in the same year the legendary computer guru Peter Norton announced that computer viruses did not exist, declaring them a myth of the same kind as the alligators in New York sewers.
Nevertheless this delusion did not prevent Symantec from starting its own anti-virus project, Norton Anti-Virus, some time later. Notoriously false messages about new computer viruses started to appear, causing panic among computer users. One of the first virus hoaxes of this kind belongs to one Mike RoChenle (pronounced very much like "Micro Channel"), who uploaded to BBS systems a lot of messages describing a supposed virus that copied itself from one BBS to another via modem at 2400 baud. Funny as it may seem, many users gave up the 2400 baud standard of the time and lowered the speed of their modems to 1200 baud.
Similar hoaxes appear even now; the most famous of them so far are GoodTimes and Aol4Free. November 1988: a total epidemic of the Morris network virus (a.k.a. the Internet Worm). This virus infected more than 6000 computer systems in the USA (including a NASA research institute) and practically paralyzed their work. Because of its erratic code, the virus sent unlimited copies of itself to other network computers, like the "Christmas Tree" worm, and for that reason completely paralyzed the network's resources. Total losses caused by the Morris virus were estimated at 96 million dollars.
The virus used errors in the Unix operating systems for VAX and Sun Microsystems machines to propagate. Besides the errors in Unix, the virus utilized several more original ideas, for example picking up user passwords. A more detailed story of this virus and the corresponding incidents may be found in a number of rather detailed and interesting articles. December 1988: the season of worm viruses continues, this time in DECNet. The worm virus called HI.COM output an image of a spruce and informed users that they should "stop computing and have a good time at home!!!" New anti-virus programs also appeared, for example Dr. Solomon's Anti-Virus Toolkit, presently one of the most powerful anti-virus packages.
1989
New viruses "Datacrime" and "FuManchu" appear, as do whole families like "Vacsina" and "Yankee". The first acted extremely dangerously: from October 13th to December 31st it formatted hard disks. This virus "broke free" and caused total hysteria in the mass media in Holland and Great Britain. September 1989: one more anti-virus program begins shipping – IBM Anti-Virus. October 1989: one more epidemic in DECNet, this time the worm virus "WANK Worm".
December 1989: an incident with a "Trojan horse" called "AIDS". Twenty thousand copies were shipped on diskettes marked "AIDS Information Diskette Version 2.0". After 90 boot-ups the Trojan encrypted all the filenames on the disk, making them invisible (setting the "hidden" attribute) and left only one file readable – a bill for $189 payable to the address P.O. Box 7, Panama. The author of the program was apprehended and sent to jail. One should note that 1989 also saw the beginning of a total epidemic of computer viruses in Russia, caused by the same "Cascade", "Jerusalem" and "Vienna", which besieged the computers of Russian users.
Luckily Russian programmers quickly discovered the principles of their work, and virtually immediately several domestic anti-viruses appeared; AVP (named "-V" at that time) was one of them. My first acquaintance with viruses (it was the "Cascade" virus) took place in 1989, when I found a virus on my office computer. This particular fact influenced my decision to change careers and create anti-virus programs. A month later a second incident (the "Vacsina" virus) was closed with the help of the first version of my anti-virus "-V" (minus-virus), several years later renamed AVP – AntiViral Toolkit Pro.
By the end of 1989 several dozen viruses roamed Russian lands. In order of appearance they were: two versions of "Cascade", several "Vacsina" and "Yankee" viruses, "Jerusalem", "Vienna", "Eddie" and "PingPong".
1990
This year brought several notable events. The first was the appearance of the first polymorphic viruses, "Chameleon" (a.k.a. "V2P1", "V2P2" and "V2P6"). Until then, anti-virus programs had used "masks" – fragments of virus code – to look for viruses. After "Chameleon"'s appearance, anti-virus developers had to look for different methods of virus detection.
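The "mask" approach that "Chameleon" defeated is simple to picture: a scanner searches each file for known byte fragments taken from virus bodies. A minimal sketch, in which the signature names and byte patterns are made up for illustration (real products used fragments of actual virus code):

```python
# Toy signature scanner: each "mask" is a byte fragment associated with a
# known virus; a file is flagged if any mask occurs anywhere in its bytes.
SIGNATURES = {
    "Example.A": bytes.fromhex("deadbeef0102"),  # hypothetical mask
    "Example.B": b"\x90\x90\xcd\x21\xeb\xfe",    # hypothetical mask
}

def scan(data: bytes) -> list:
    """Return the names of all signature masks found in the given bytes."""
    return [name for name, mask in SIGNATURES.items() if mask in data]
```

A polymorphic virus defeats exactly this scheme: each copy is encrypted with a different key under a differently generated decryptor, so no fixed fragment is shared between generations.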
The second event was the appearance of the Bulgarian "virus production factory": enormous numbers of new viruses were created in Bulgaria. These were whole families of viruses – "Murphy", "Nomenclatura", "Beast" (or "512", "Number-of-Beast") – plus modifications of the "Eddie" virus, etc. A certain Dark Avenger became extremely active, making several new viruses a year and utilizing fundamentally new algorithms for infecting and for covering his tracks in the system. It was also in Bulgaria that the first BBS dedicated to the exchange of virus code and information for virus makers opened.
In July 1990 there was an incident with the "PC Today" computer magazine (Great Britain): it shipped with a floppy disk infected by the "DiskKiller" virus. More than 50,000 copies were sold. In the second half of 1990 two stealth monsters appeared – "Frodo" and "Whale". Both viruses utilized extremely complicated stealth algorithms; on top of that, the 9 KB "Whale" used several levels of encryption and anti-debugging techniques.
1991
The computer virus population grows continuously, now reaching several hundred.
Anti-viruses also show increasing activity: two software giants (Symantec and Central Point) issue their own anti-virus programs – Norton Anti-Virus and Central Point Anti-Virus. They are followed by less well-known anti-viruses from Xtree and Fifth Generation. In April a full-scale epidemic broke out, caused by the file and boot polymorphic virus "Tequila"; in September the same kind of story happened with the "Amoeba" virus. Summer of 1991: the "Dir_II" epidemic. It was a link virus using fundamentally new methods of infecting files.
1992
Non-IBM PC and non-MS-DOS viruses are virtually forgotten: "holes" in global access networks are closed, errors corrected, and network worm viruses lose the ability to spread. File, boot and file-boot viruses for the most widely used operating system (MS-DOS) on the most popular computer model (IBM PC) become more and more important. The number of viruses increases in geometric progression; various virus incidents happen almost every day. Miscellaneous anti-virus programs are being developed, and dozens of books and several periodical magazines on anti-viruses are being printed.
A few things stand out. Early 1992: the first polymorphic generator, MtE, appears, serving as a base for several polymorphic viruses which follow almost immediately. MtE was also the prototype for several later polymorphic generators. March 1992: the "Michelangelo" virus epidemic (a.k.a. "March6") and the accompanying hysteria. This is probably the first known case of anti-virus companies making a fuss about a virus not to protect users from any kind of danger but to attract attention to their products – that is, to create profits.
One American anti-virus company actually announced that on the 6th of March the information on over five million computers would be destroyed. As a result of the fuss, the profits of various anti-virus companies jumped severalfold; in reality only about 10,000 computers suffered from the virus. July 1992: the first virus construction sets appear, VCL and PS-MPC. They made the large flow of new viruses even larger and stimulated virus makers to create other, more powerful construction sets, as MtE had done in its own area.
Late 1992: the first Windows virus appears, infecting this OS's executables, and starts a new page in virus making.
1993
Virus makers are starting to do some serious damage: besides hundreds of mundane viruses no different from their counterparts, besides polymorphic generators and construction sets, besides new electronic editions for virus makers, more and more viruses appear that use highly unusual ways of infecting files, introducing themselves into the system, etc. The main examples are: "PMBS", working in Intel 80386 protected mode; "Strange" (or "Hmm") – a "masterpiece" of stealth technology, implemented at the level of the hardware interrupts INT 0Dh and INT 76h; "Shadowgard" and "Carbunkle", which widened the range of companion-virus algorithms; and "Emmie", "Metallica", "Bomber", "Uruguay" and "Cruncher" – fundamentally new techniques for "hiding" their own code inside infected files. In spring 1993 Microsoft released its own anti-virus, MSAV, based on CPAV by Central Point.
1994
The problem of CD viruses is gaining importance. Having quickly gained popularity, CDs became one of the main means of spreading viruses.
There were several simultaneous cases of a virus getting onto the master disk during the preparation of batch CDs. As a result, a fairly large number (tens of thousands) of infected CDs hit the market. Of course they cannot be cured; they simply have to be destroyed. Early in the year two extremely complicated polymorphic viruses popped up in Great Britain, "SMEG.Pathogen" and "SMEG.Queeg" (even now not all anti-virus programs are able to detect these viruses with 100% accuracy). Their author placed infected files on a BBS, causing real panic and fear of an epidemic in the mass media.
Another wave of panic was created by a message about a supposed virus called "GoodTimes" that spread via the Internet and infected a computer upon receipt of e-mail. No such virus really existed, but after some time an ordinary DOS virus containing the text string "Good Times" appeared; it was called "GT-Spoof". Law enforcement increased its activity: in the summer of 1994 the author of SMEG was "sorted out" and arrested. At approximately the same time, also in Great Britain, an entire group of virus makers calling themselves ARCV (Association for Really Cruel Viruses) was arrested.
Some time later one more virus author was arrested in Norway. Some new and unusual viruses appeared. January 1994: "Shifter" – the first virus infecting object modules (OBJ files); "Phantom1" – the cause of the first polymorphic-virus epidemic in Moscow. April 1994: "SrcVir" – a virus family infecting program source code (C and Pascal). June 1994: "OneHalf" – one of the most popular viruses in Russia so far – starts a total epidemic. September 1994: "3APA3A" – a boot-file virus epidemic. This virus uses a highly unusual way of incorporating itself into MS-DOS.
No anti-virus was ready to meet such a monster. In spring 1994 one of the anti-virus leaders of the time – Central Point – ceased to exist, acquired by Symantec, which by then had managed to "swallow" several minor anti-virus companies – Peter Norton Computing, Certus International and Fifth Generation Systems.
1995
Nothing in particular happens among DOS viruses, although several complicated enough monster viruses appear, like "NightFall", "Nostradamus" and "Nutcracker", along with some funny viruses like the "bisexual" virus "RMNS" and the BAT virus "Winstart".
The "ByWay" and "DieHard2" viruses become widespread, with news about infected computers coming from all over the world. February 1995: an incident with Microsoft: Windows95 demo disks were infected with "Form". Copies of these disks had been sent to beta testers by Microsoft; one of the testers was not too lazy to check the disks for viruses. Spring 1995: two anti-virus companies – ESaSS (ThunderBYTE Anti-Virus) and Norman Data Defense (Norman Virus Control) – announce their alliance. These companies, each making powerful enough anti-viruses, joined efforts and started working on a joint anti-virus system.
August 1995: one of the turning points in the history of viruses and anti-viruses: the first "live" virus for Microsoft Word ("Concept") actually appeared. Within a month the virus "tripped around the world", pestering the computers of MS Word users and becoming a firm No. 1 in the statistical research held by various computer titles.
1996
January 1996: two notable events – the appearance of the first Windows95 virus ("Win95.Boza") and the epidemic of the extremely complicated polymorphic virus "Zhengxi" in St. Petersburg (Russia). March 1996: the first Windows 3.x virus epidemic. The name of the virus is "Win.Tentacle". This virus infected a computer network in a hospital and in several other institutions in France. The event is especially interesting because it was the FIRST Windows virus on a spree: before that time (as far as I know) all Windows viruses had lived only in collections and in the electronic magazines of virus makers; only boot viruses, DOS viruses and macro viruses were known to ride free. June 1996: "OS2.AEP" – the first virus for OS/2 correctly infecting EXE files of this operating system.
Earlier under OS/2 there had existed only viruses that wrote themselves in place of a file, destroying it, or acted as companions. July 1996: "Laroux" – the first virus for Microsoft Excel caught live (originally at about the same time in two oil companies, in Alaska and in the South African Republic). The idea of "Laroux", like that of the Microsoft Word viruses, was based on the presence of so-called macros (Basic programs) in the files. Such programs can be included in both Microsoft Excel spreadsheets and Microsoft Word documents.
As it turned out, the Basic language built into Microsoft Excel also allows one to create viruses. December 1996: "Win95.Punch" – the first memory-resident virus for Windows95. It stays in Windows memory as a VxD driver, hooks file access and infects Windows EXE files as they are opened. In general, 1996 marks the start of widespread virus intervention into the 32-bit Windows operating systems (Windows95 and WindowsNT) and into the Microsoft Office applications. During this and the following year several dozen Windows viruses and several hundred macro viruses appeared.
Many of them used new technologies and methods of infection, including stealth and polymorphic abilities. That was the next round of virus evolution: within two years they repeated the path of improvement taken by DOS viruses, step by step starting to use the same features DOS viruses had used 10 years before, but on the next technological level.
1997
February 1997: "Linux.Bliss" – the first virus for Linux (a Unix clone). Thus viruses occupied one more "biological" niche. February–April 1997: macro viruses migrate to Office97.
The first of them turned out to be merely macro viruses for Microsoft Word 6/7 "converted" to the new format, but almost immediately viruses appeared that targeted Office97 documents exclusively. March 1997: "ShareFun" – a macro virus hitting Microsoft Word 6/7. It uses not only the standard features of Microsoft Word to propagate but also sends copies of itself via MS-Mail. April 1997: "Homer" – the first network worm virus using the File Transfer Protocol (FTP) for propagation. June 1997: the first self-encrypting virus for Windows95 appears. This virus of Russian origin was sent to several BBSes in Moscow, which caused an epidemic.
November 1997: the "Esperanto" virus. This is the first virus that tries to infect not only DOS and 32-bit Windows executable files but also to spread to Mac OS (Macintosh). Fortunately, the virus is not able to spread across platforms because of bugs. December 1997: a new virus type, the so-called mIRC Worms, came into being. The most popular Windows Internet Relay Chat (IRC) utility, mIRC, proved to have a "hole" allowing virus scripts to transmit themselves along IRC channels. The next mIRC version blocked the hole, and the mIRC Worms vanished. The main event of 1997, however, was that the anti-virus department of KAMI Ltd. broke away from the mother company to form an independent one. Currently the company, known as Kaspersky Lab, is a recognized leader of the anti-virus industry. Since 1994 the AntiViral Toolkit Pro (AVP) anti-virus scanner, the company's main product, has constantly shown high results in tests by various test laboratories all over the world. The creation of an independent company gave the at-first small group of developers the chance to gain the lead on the domestic market and prominence on the world one.
In a short time, versions for practically all popular platforms were developed and released, new anti-virus solutions were offered, and international distribution and product support networks were created. October 1997: an agreement was signed on licensing AVP technologies for use in F-Secure Anti-Virus (FSAV). The F-Secure Anti-Virus package was the new anti-virus product of DataFellows (Finland), previously known as the manufacturer of the F-PROT anti-virus package. 1997 was also the year of several scandals between the main anti-virus manufacturers in the US and Europe.
At the beginning of the year McAfee announced that its experts had detected a "feature" in the anti-virus programs of Dr. Solomon, one of its main competitors. The McAfee statement said that if the Dr. Solomon anti-virus detects several virus types while scanning, it switches to an advanced scanning mode. This means that while scanning an uninfected computer the Dr. Solomon anti-virus operates in the usual mode, and switches to the advanced mode – a "cheat mode", according to McAfee – only when testing virus collections, enabling it to detect viruses invisible to the usual mode.
Consequently the Dr. Solomon anti-virus shows both good speed while scanning uninfected disks and good virus detection while scanning virus collections. A bit later Dr. Solomon struck back, objecting to an incorrect McAfee advertising campaign; the claims concerned the text "The Number One Choice Worldwide. No Wonder The Doctor's Left Town". At the same time McAfee was in court with Trend Micro, another anti-virus software manufacturer, over an alleged violation of a patent on Internet and e-mail data scanning technology.
Symantec also turned out to be involved, accusing McAfee of using Symantec code in McAfee products. And so on. The year was rounded off by one more noteworthy event related to the McAfee name: McAfee Associates and Network General declared their consolidation into the newborn Network Associates company, positioning its services not only on the anti-virus protection software market but also on the markets of universal computer security systems, encryption and network administration. From this point in virus and anti-virus history, McAfee corresponds to NAI.
1998
The virus attack on MS Windows, MS Office and the network applications does not weaken. New viruses arise, employing still more complex tricks to infect computers and advanced methods of network-to-computer penetration. Besides these, numerous so-called Trojans stealing Internet access passwords, and several kinds of latent administration utilities, came into the computer world. Several incidents with infected CDs were revealed: some computer media publishers distributed CIH and Marburg (Windows viruses) through CDs attached to the covers of their issues.
The beginning of the year: an epidemic of the "Win32.HLLP.DeTroie" virus family shocked the computer world. These viruses not only infected 32-bit Windows executable files but were also capable of transmitting information about the infected computer to their "owner". As the viruses used specific libraries attached only to the French version of Windows, the epidemic affected only French-speaking countries. February 1998: one more virus type infecting Excel tables, "Excel4.Paix" (a.k.a. "Formula.Paix"), was detected.
While rooting itself into Excel tables, this macro virus does not employ the macro area usual for this kind of virus but formulas, which proved capable of accommodating self-reproducing code. February–March 1998: "Win95.HPS" and "Win95.Marburg", the first polymorphic 32-bit Windows viruses, were detected – and, furthermore, they were "in the wild". Anti-virus developers had no choice but to rush to adjust their polymorphic-virus detection techniques, designed so far only for DOS viruses, to the new conditions.
March 1998: "AccessiV" – the first Microsoft Access virus – was born. There was no boom about it (as there had been with "Word.Concept" and "Excel.Laroux"), because the computer society had already got used to MS Office applications going down thick and fast. March 1998: the "Cross" macro virus, the first virus infecting two different MS Office applications – Access and Word – is detected. Following it, several more viruses transferring their code from one MS Office application to another emerged.
May 1998: the "RedTeam" virus infects Windows EXE files and dispatches the infected files through Eudora e-mail. June 1998: the "Win95.CIH" virus epidemic – at first massive, then global, then a kind of computer holocaust – with the number of reports of infected computer networks and home personal computers reaching hundreds if not thousands. The beginning of the epidemic was registered in Taiwan, where an unknown hacker mailed the infected files to local Internet conferences.
From there the virus made its way to the USA, where through staff oversight it at once infected several popular Web servers, which started distributing infected game programs. Most likely these infected files on game servers brought about the computer holocaust that dominated the computer world all year. According to the "popularity" ratings, the virus pushed "Word.CAP" and "Excel.Laroux" into second place. One should also note the virus's dangerous payload: depending on the current date, the virus erased the Flash BIOS, which in some conditions could kill the motherboard.
August 1998: the birth of the sensational "BackOrifice" ("Backdoor.BO") – a utility for the latent (hacker's) management of remote computers and networks. After "BackOrifice" some other similar programs – "NetBus", "Phase" and others – came into being. Also in August the first virus infecting Java executable files, "Java.StrangeBrew", was born. The virus posed no danger to Internet users, as there was no way to employ the functions critical for its replication on a remote computer. However, it revealed that even Web servers and browsers could be attacked by viruses.
November 1998: "VBScript.Rabbit" – the Internet expansion of computer parasites proceeded with three viruses infecting Visual Basic scripts (VBS files), which are actively used in Web page development. As a logical consequence of the VBScript viruses, a full-value HTML virus ("HTML.Internal") was born. Virus writers had obviously turned their efforts to network applications and to the creation of a full-value network worm virus that could employ MS Windows and Office options, infect remote computers and Web servers, and/or aggressively replicate itself through e-mail.
The anti-virus industry was also considerably reshaped. In May 1998 Symantec and IBM announced the union of their forces on the anti-virus market: their joint product would be distributed under the Norton Anti-Virus trademark, and the IBM Anti-Virus (IBMAV) program would be discontinued. The response of the main competitors, Dr. Solomon and NAI (formerly McAfee), followed immediately: they issued press releases offering IBM product users a promotional replacement of the dead anti-virus with their own products. Less than one month later Dr. Solomon “committed suicide”.


Computerized Enrollment System

INTRODUCTION The slow processing of enrollment is a heavy issue for all schools. Colleges, secondary schools, and elementary schools alike use registration forms for their enrollees. Students sign and fill up all the necessary information in order to have a personal file in the school registrar. The manual registration form is the first and most common method used for enrollment registration, and this kind of processing wastes time: students spend hours falling in line to complete their requirements during enrollment. Most of the time, enrollees go back and forth between different offices – from the registrar to their respective dean for the scheduling of their subjects, then for their assessment, and on to the accounting office – falling in line for hours to obtain, fill up, and sign the many enrollment forms. This manual form of enrollment wastes a great deal of time. Acquiring and processing documents for assessment also takes a long time, because documents are manually acquired, evaluated, and processed. Slow processing of assessment is further aggravated when documents from a person's file are lost or misplaced, since only file cabinets are used as data storage and no back-up strategy is available.
Inconsistency of data is encountered in locating files, because enrollment forms are manually written and assessments manually computed. Enrollment form data are checked manually for accuracy; HR staff chase down employees to determine who has completed the process; benefits are calculated by hand; enrollment information is manually entered into the carrier site and deduction amounts into the payroll system; and paperwork is filled out and stored. The paper-based manual method of open enrollment also incurs a soft cost that cannot be measured: employee dissatisfaction. Employees today are not happy with the level of communication they receive from traditional open enrollment. Most schools, businesses, and establishments face the same kinds of problems, and most have sought a solution to help address them and to offer the best possible services to their enrollees or clients. Through the advancement of technology, they have used various software applications to computerize some processes, but these lack features that fit their needs. In this regard, most of them have sought a customized information system. Nowadays the computer serves an important role in our society, most especially in the school premises. It makes work easier and faster, lessens errors, and reduces costs to an organization, from paperwork up to a computerized working system. A system is designed to perform the processes involved in registration, advising, assessment, and payment of the students as well as the scheduling of classes. A new automated enrollment system was proposed and passed by the administration just recently. According to the chairperson of the Department of Information Technology and Systems, this enrollment system will push through in the first semester of School Year 2009-2010. According to the source, the Computerized Enrollment System will automatically get the student's subject/section for hassle-free enrollment.
This is for students without any pending back subjects; students with pending back subjects still need to meet their department chairs/coordinators first for the advising of subjects before using the enrolment system. The system design project will provide the needed storing of information in a faster, more convenient way by keeping the files of student enrollees in a computer system, lessening the effort of faculty staff in storing the files of each student. This will also serve as an information source, especially for irregular students, freshmen, transferees, returnees, and professors, giving them access to course, subject, professor, and student-enrollee information. This information can be viewed in just a second without worrying that a single file is lost. The idea behind an enrolment system is not a new concept. As student enrollees increase every year, the enrolment procedure becomes harder to deal with. This only increases the problems facing enrollment and the need for a system that provides an easier way of enrolling.
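The record-keeping idea described above (one stored record per student, retrievable in an instant, with no lost or duplicated files) can be sketched minimally as follows. The class name and record fields are illustrative assumptions, not the school's actual system.

```python
class EnrollmentRegistry:
    """Keeps exactly one record per student ID, so files cannot be lost or duplicated."""

    def __init__(self):
        self._records = {}  # student_id -> record dict

    def enroll(self, student_id, name, status, year_level):
        # Re-enrolling an existing ID updates the record in place instead of
        # creating a redundant copy, addressing the data-redundancy problem.
        self._records[student_id] = {
            "name": name,
            "status": status,        # e.g. "freshman", "transferee", "returnee"
            "year_level": year_level,
        }

    def lookup(self, student_id):
        # Constant-time retrieval replaces browsing a file cabinet.
        return self._records.get(student_id)

registry = EnrollmentRegistry()
registry.enroll("2013-0001", "Juan dela Cruz", "freshman", 1)
print(registry.lookup("2013-0001")["name"])  # Juan dela Cruz
```

A lookup for an unknown ID simply returns `None` rather than forcing staff to search misfiled paperwork.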
This will be a big help to all the enrolment staff, especially those under the computer department, because they are the ones entitled to access and read the information stored there. It will help our institution to have a system that upgrades the enrollment processes so as to meet the quality our institution is striving for. October 1st will see the biggest change in the UK pension system since its inception as auto-enrolment begins to be rolled out. By the end of the year some 600,000 people are expected to be enrolled into the new system, which automatically diverts funds from their pay packets.

The Department of Work and Pensions (DWP) has outlined more details on the number of people expected to be signed up during the initial waves. It estimates that 380,000 workers will be signed up in October, a total of 420,000 will be enrolled by the end of November, and 600,000 will be in place by the end of the year. WH Smith has selected an administration system to comply with its auto-enrolment obligations from 1 March 2013. The system automatically identifies and enrolls relevant employees into the chosen pension scheme, provides communication and administration processes, and handles refunds if an employee opts out.
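As a sketch of how such a system might identify whom to enroll automatically, the following uses an illustrative age band and earnings trigger. The numeric thresholds and field names are placeholders for illustration, not the statutory DWP figures or any vendor's actual schema.

```python
# Assumed illustrative thresholds, not the real statutory values.
MIN_AGE = 22
MAX_AGE = 65               # stand-in for state pension age
EARNINGS_TRIGGER = 8000    # assumed annual earnings trigger, in GBP

def is_eligible(age, annual_earnings, already_in_scheme=False):
    """Return True if the employee should be auto-enrolled."""
    if already_in_scheme:
        return False       # nothing to do; they may also later opt out
    return MIN_AGE <= age < MAX_AGE and annual_earnings >= EARNINGS_TRIGGER

def auto_enroll(employees):
    """Filter a workforce list down to those the system must enroll."""
    return [e["name"] for e in employees
            if is_eligible(e["age"], e["earnings"], e.get("enrolled", False))]

staff = [
    {"name": "Ann", "age": 30, "earnings": 20000},
    {"name": "Ben", "age": 19, "earnings": 15000},                    # too young
    {"name": "Cal", "age": 40, "earnings": 5000},                     # below trigger
    {"name": "Dee", "age": 50, "earnings": 30000, "enrolled": True},  # already in scheme
]
print(auto_enroll(staff))  # ['Ann']
```

The same predicate can be re-run every pay period, which is how such systems catch employees who cross the age or earnings threshold mid-year.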
The system is provided by Ceridian UK, which already supplies the retailer with its HRevolution SaaS system for its 16,000 employees' online pay slips. "Stand-alone software/web systems" for enrollment are another common approach. This approach involves the implementation of an automated solution to process open enrollment. The key business drivers for eliminating a paper-based process and replacing it with an automated one are: reduction in the time it takes to process open enrollment; elimination of the cost of producing and storing paper materials; mitigation of the possibility of human error; and increased employee satisfaction.
Employees are not the only ones who benefit from enrollment automation. Thanks to the customizable, rules-based validation capability present in this technology, the system will catch errors and incomplete information, prompting employees to make changes as needed. Additionally, managers and administrators can run reports to discover who still needs to complete the process and send automated reminders, without having to chase down employees on a one-on-one basis. The workload on HR staff is greatly reduced, as employees can handle their own enrollment tasks. Research has shown that employees and administrators are still dissatisfied with the level of information and analysis tools available with many stand-alone benefit enrollment applications. The fundamental issue is that in order to make informed healthcare decisions, employees, HR administrators, and financial buyers need access to data commonly found in HR/Benefits and workforce management systems. Access to demographic information alongside benefit participation data allows a company to analyze trends so that no group is overpaying for or under-utilizing the benefits. Data can be segmented into employee groups by age, family status, location, need, pay scale, and multiple combinations of the above.
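The rules-based validation and automated-reminder reporting described above can be sketched as follows. The field names and rules here are hypothetical, not taken from any particular product.

```python
# Hypothetical required fields for an open-enrollment form.
REQUIRED_FIELDS = ["name", "age", "plan", "dependents"]

def validate_submission(form):
    """Return a list of problems; an empty list means the form is complete."""
    errors = []
    for field in REQUIRED_FIELDS:
        if field not in form or form[field] in ("", None):
            errors.append(f"missing field: {field}")
    # Example of a custom rule layered on top of the required-field check.
    if isinstance(form.get("age"), int) and form["age"] < 16:
        errors.append("age below minimum")
    return errors

def pending_reminders(submissions):
    """Names of employees who still need to fix or finish their form,
    i.e. the report administrators would use for automated reminders."""
    return [name for name, form in submissions.items() if validate_submission(form)]

forms = {
    "Ann": {"name": "Ann", "age": 30, "plan": "basic", "dependents": 0},
    "Ben": {"name": "Ben", "age": 25, "plan": "", "dependents": 1},  # plan missing
}
print(pending_reminders(forms))  # ['Ben']
```

Because the validation runs at submission time, errors are caught and corrected by the employee immediately instead of surfacing weeks later in HR.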
Synchronizing data between the benefit enrollment software and the day-to-day HR/Benefit system is key to achieving this, because employee data are always changing: employees change departments, pay grades, locations, family status, work status, and more. The completely integrated single system, with one database, is the best option available today. The need to make more information available to employees, and the need to streamline processes even further, are what drive many companies into the third and most advanced phase of their open enrollment evolution.
This is the sweet spot, where businesses integrate their benefits enrollment system with their HR/Benefits and workforce management platforms to develop a truly comprehensive, integrated approach. When open enrollment is deployed as a complete, full-service package, an organization can: empower an employee's decision process by providing all the information they need (paystubs, HR, business rules, and workforce management) through one interface; enter and update all employee information in one place; and maintain up-to-date data throughout the open enrollment process, now that data integrity is assured from system to system. Enrollment systems are viable choices for schools, training programs, workplace operations, and educational institutions such as colleges and grade schools. An online automated system, which accepts and organizes enrollee information, can boost productivity. System operations proceed faster, more efficiently, and with greater accuracy than a manual enrollment system. Programs are comprehensive and capable of handling all interrelated processes, including the completion of all related forms and the development, organization, and maintenance of files.
They also handle the creation of master lists and other special reports; fee assessments and balances; departmentalized accounts receivable reports; and class schedule and record updates. Garcia (2002) also created the LSPZ Computerized Enrollment System. The study can be a great help to the persons concerned during the enrollment period: the registrar and the instructors are relieved of the burden of manually browsing over enrollment slips for record purposes. Dioso (2004) stated that the computer assists careful, intelligent planning, organizing, actuating, and controlling. This may be observed in how computers monitor production activities, solve scientific problems, and help arrive at tentative answers to a multitude of involved conditions. Computer-generated enrollment solutions afford students choices, such as 24/7 payment options, that can benefit the whole operation. Systems typically include data protection and back-up frameworks. Student enrollees have access to their personal information only. School personnel are able to keep up with teacher and student grades, point averages, and other pertinent identification data, such as grades, quizzes, and any other modules that are considered necessary for efficient administration.
Modifications in school policies and requirements are easily edited online. Career guidance and evaluations are facilitated and traceable. What the online enrollment program effectively amounts to is a self-service, on-demand, student- and administrator-friendly guide and process optimizer. Employee workload is decreased, and administrators can spend more time giving students more personalized attention and encouragement. The enrollment system is also compatible with a mobile app version for students and school staff. Some resources are available on download.com or via torrent downloaders. More reference links: www.oppapers.com, www.scribd.com. This enrollment process in schools is supported by Information Technology, whose objective is to relieve humanity from doing loads of work overtime. By having a computerized system, the cost during enrollment will be cut down and much effort will be reduced. The project involves a series of studies that cover all the requirements of creating a computerized enrollment system. The goal of the study is to provide an efficient computer-based system that will easily update, retrieve, and maintain student records. The developer's concern is to make a system that will help speed up the process. No one can be blamed for slow registration processing, because of the large total population of enrollees. School staff in charge of the enrollment also receive a lot of comments from the enrollees, and they encounter different kinds of problems, stress, and more during the enrollment. The researcher aims to gather and record the effects of the computerized enrollment system at Kings College of the Philippines from the enrollment of the first semester of School Year 2013, according to the status and year level of the enrollees.
Kings College of the Philippines (formerly Eastern Luzon College, ELC Benguet) was founded in 2004. Located at Pico, La Trinidad, Benguet, it has been an educational institution for nine (9) years now. Reverend Kwon Young Soo from Korea is the founder of the school, which holds services inside the school weekly and every Sunday. One of the problems of Kings College of the Philippines during enrolment was the generation of forms: students needed to fill up many copies of a registration form (copies for the student, accountant, registrar, and dean) and consumed a lot of time doing so.
Since students manually fill up the official documents of the school, data redundancy has a great possibility of causing further complexity in the enrolment process. The enrolment itself can be considered a problem for the students: from evaluation to validation, it is a long process, and students have to go back and forth between different offices to complete the enrolment procedures. It can be tiring and such a waste of effort for them. Another problem of the school concerned the enrollment personnel, especially in the accounting department, which had only two (2) personnel to accommodate all the students during enrolment and payment of fees. The outcome was that the school had a hard time accommodating a large number of students, which forced many students to wait in line. For the past seven (7) years Kings College of the Philippines had been using a manual process of enrolment. Enrollees needed to fill up a lot of registration forms, which wasted time. Many enrollees did not understand or did not know what to write on their forms because no one could guide or assist them. On the other hand, the accountant or registrar sometimes could not understand what the enrollees had written on the registration form.
Sometimes the enrollees also did not fill up the form very well. It was in October 2011, during second-semester enrolment, that Kings College of the Philippines first used the computerized enrolment system. It is used not only by the enrollees but also by the guidance office, accounting office, registrar, library, and more. The idea of using this system came from one of the professors and the accounting in-charge, Mr. Heginio Clyde Abellanosa. At present Kings College of the Philippines uses the enrolment system, with computer laboratory 4 as the encoding area of the school. You can fill up all the necessary information, choose the subjects and schedules that you want, and in a second you can get your enrolment form together with your total tuition fee. After that you can proceed to the accounting office for your payment. Every now and then the staff can monitor the list of enrollees by course, the total number of enrollees per year level, and the total number of enrollees per semester. From this system you can monitor records and lists of payments or balances. The respective deans of the departments can monitor their students from their own offices. METHODOLOGY This chapter contains the research design used by the author.
The procedures used in this study were namely: population and locale of the study, data collection instrument, and statistical treatment of data. The descriptive method of research was applied in this study. This type of research aims to describe the data and characteristics of the case being studied; the idea behind it is to study frequencies, averages, and other statistical calculations. Although this research is highly accurate, it does not gather the causes behind the situation. Descriptive research is mainly done when a researcher wants to gain a better understanding of a topic.
Explanatory research was also used by the researcher; it is carried out to ascertain how the occurrence of, or changes in, one variable lead to the outcomes presented. Through this research, the researcher was able to find out how much variation is caused by another variable. This was appropriate for this study since it helped the researcher achieve a better outcome in terms of the description of the data in this research. Research Design This study, entitled "Computerized Enrollment System: Its Effect on the Enrollees of Kings College of the Philippines, Particularly the Criminology Students," used the descriptive method of research. This type of research studies frequencies, averages, and other statistical calculations. Descriptive research is defined as involving the collection of data in order to answer questions concerning the current status of the study. In this study the questionnaire was the main data-collection tool. It is the most common tool used to gather information regarding the students and was utilized in order to evaluate the effects of the Computerized Enrollment System on the respondents.
It helped the researcher to have a better outcome in terms of the description of the data in this research. The questionnaire answered by the respondents consists of the socio-demographic profile and the effects of the Computerized Enrollment System on the enrollees of Kings College of the Philippines, particularly the criminology students. This design is used to get information fast. Population and Locale of the Study This study was conducted at Kings College of the Philippines, located at Pico, La Trinidad, Benguet, during the first semester of School Year 2013. The data were based on the enrollees enrolled in the College of Criminal Justice Education, from whom the information needed in the study was gathered. The case study is entitled "Computerized Enrollment System: Its Effect on the Enrollees of Kings College of the Philippines, Particularly the Criminology Students."
The case stated that the school's enrollment process is time-consuming, produces redundant student records, and has slow retrieval of student records. The researcher dealt with transferees, freshmen, irregular students, returnees, and professors, and the system's effect on every enrollee. The researcher studied how the system works faster than the manual form of enrolment. Sketch of Kings College of the Philippines Conceptual Framework The socio-demographic profile of the enrollees, such as age, type of school graduated from, year level, and status, could possibly affect the enrollees' experience of the computerized enrolment system.
The present study determines the computerized enrolment system and its effect on the respondents. This study is delimited to the Computerized Enrolment System and its effect on the enrollees of Kings College of the Philippines, particularly the criminology students. Paradigm of the Study The operational paradigm of the study is where the Computerized Enrolment System, its effects, and the intervening variables are enumerated. Figure 1, the paradigm of the study, shows the different variables of the study.

Independent Variable: Computerized Enrollment System features: 1. Registration; 2. Assessment; 3. Systematic Enrollment; 4. School Fee Management; 5. Computerized Student Accounts; 6. Class Schedule; 7. Secure Login System; 8. Student Records; Others.

Dependent Variables: A. Level of the Computerized Enrollment System affecting the enrollees (rate: 3 = Very effective, 2 = Effective, 1 = Not effective); B. Effect on the students (5 = Always, 4 = Often, 3 = Sometimes, 2 = Seldom, 1 = Never).

Intervening Variables: Socio-demographic profile: 1. Age; 2. Type of secondary school graduated from; 3. Year level; 4. Status.

Statement of the Problem Enrollment plays a very serious role in every school premise. It is very important in every school and acts as its foundation.
Each school has its own system for handling its enrollment, and to accommodate many students, schools need to computerize their enrollment systems to make their work easier to manage. The respondents are the students of the College of Criminal Justice Education, from first year to fourth year. They consist of ______ students: _____ from 1st year, ______ from 2nd year, _____ from 3rd year, and _______ from 4th year. This study seeks to find and assess the effects of the Computerized Enrollment System on the enrollees of Kings College of the Philippines, particularly the criminology students, at Pico, La Trinidad, Benguet. Particularly, it seeks to: 1. Describe the Computerized Enrollment System affecting the enrollees of Kings College of the Philippines, particularly the criminology students. 2. Identify the level of importance of the Computerized Enrollment System to the enrollees of Kings College of the Philippines, particularly the criminology students. 3. Recognize the degree of problems encountered by the enrollees of Kings College of the Philippines, particularly the criminology students. 4. Identify why they used the Computerized Enrollment System. The following hypotheses were tested in the study: 1. The effects of the Computerized Enrollment System on the enrollees of Kings College of the Philippines, particularly the criminology students, are modified. 2. The level of importance of the Computerized Enrollment System to the enrollees of Kings College of the Philippines is determined. 3. The degree of problems encountered by the enrollees of Kings College of the Philippines, particularly the criminology students, is determined. Data Collection Tool The researcher has already observed the Computerized Enrolment System, utilizing a survey of the information flow within Kings College of the Philippines.
The questionnaire consists of the following: Part I is in the form of check marks covering the socio-demographic profile, such as age, type of secondary school graduated from, year level, and status of the enrollees. Part II is in the form of check marks wherein the level of the Computerized Enrollment System affecting the enrollees is enumerated based on observation. Part III is in the form of check marks covering the level of importance of the Computerized Enrollment System to the enrollees, particularly the criminology students. Most inquiries written in the questionnaire were made by the researcher, and others were based on observation. Statistical Treatment of Data The data gathered from the system were tallied and tabulated for presentation, analysis, and interpretation. They were treated with the weighted mean to assess the Computerized Enrollment System and its effect on the enrollees of Kings College of the Philippines, particularly the criminology students. This treatment preserves the necessary information sought without losing its validity, and it gave the researcher a clearer analysis and easier interpretation. The data came from the _____ 1st-year, ______ 2nd-year, _____ 3rd-year, and ____ 4th-year enrollees of Criminology.
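The weighted-mean treatment mentioned above can be computed as follows: each scale value is multiplied by the number of respondents who chose it, and the sum is divided by the total number of respondents. The tallies below are invented sample data, not the study's actual results.

```python
def weighted_mean(tallies):
    """tallies maps a scale value (e.g. 1-3) to the number of respondents
    who chose it; the weighted mean is sum(value * count) / total count."""
    total = sum(tallies.values())
    return sum(value * count for value, count in tallies.items()) / total

# Hypothetical tallies for one questionnaire item on the 3-point scale
# (3 = Very Effective, 2 = Effective, 1 = Not Effective).
registration_tallies = {3: 40, 2: 25, 1: 5}
print(round(weighted_mean(registration_tallies), 2))  # 2.5
```

The resulting mean is then compared against the scale (e.g. a value near 3 would be interpreted as "Very Effective") to describe each item.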
It discusses the number of problems in using the manual registration form, based on the number and status of enrollees for the first-semester enrolment of School Year 2013, according to the status of the enrollees (returnee, transferee, freshman, professor) and the time and type of workload lessened. Name: (optional)_______________________ Age: _______15-20, ________21-25, __________26-30 Type of school graduated from: _______Public _________Private Year Level: _____1st year, ______2nd year, ________3rd year, _______4th year Status: _____Freshman, _____Old, _____Transferee, ______Returnee, ______Professor 1.
Which features of the Computerized Enrollment System affect the enrollees during the enrollment at Kings College of the Philippines, particularly the criminology students? Direction: Please rate the level at which the Computerized Enrollment System affects the enrollees during enrollment, where 3 = Very Effective, 2 = Effective, 1 = Not Effective.

Computerized Enrollment System | 3 | 2 | 1
Registration | | |
Assessment | | |
Systematic Enrollment | | |
School Fee Management | | |
Secure Login System | | |
Computerized Student Accounts | | |
Class Schedule | | |
Student Records | | |
Others | | |

2.
What is the level of importance of the Computerized Enrollment System to the enrollees of Kings College of the Philippines, particularly the criminology students? Direction: Please rate the importance of the Computerized Enrollment System to the enrollees during enrollment, where 5 = Strongly Agree, 4 = Agree, 3 = Disagree, 2 = Strongly Disagree, 1 = Undecided.

Effects on the Enrollees | 5 | 4 | 3 | 2 | 1
More personalized attention | | | | |
Decreased workload | | | | |
More efficiency | | | | |
Greater accuracy | | | | |
Faster processing | | | | |
Systematized recording and verifying of data | | | | |
Easier way of registration | | | | |
Reduced redundancy | | | | |
Enrollment forms that are checked manually | | | | |
Fewer errors | | | | |
Fast summarizing of reports | | | | |
Updated reports of payments | | | | |
Others | | | | |

3. What is the degree of problems encountered by the enrollees of Kings College of the Philippines, particularly the criminology students? Direction: Please rate the degree of problems encountered by the enrollees of Kings College of the Philippines, particularly the criminology students, where 5 = Always, 4 = Often, 3 = Sometimes, 2 = Seldom, 1 = Never.

Degree of Problems | 5 | 4 | 3 | 2 | 1
Falling in line | | | | |
Wasting time | | | | |
Filling up and signing | | | | |
Encoding | | | | |
Printing | | | | |
Others | | | | |

Objectives: The researcher aims to accomplish the following: 1. Determine the specific reasons why the Computerized Enrollment System affects the enrollees of Kings College of the Philippines, particularly the criminology students. 2. Determine the positive and negative effects of the Computerized Enrollment System on the students. 3. Show the consequences of excessive use of the Computerized Enrollment System by the students. 4. Gather more information, study the system's contribution to the students, and share ideas about the system with the students.


Buildings/Architectural Designs Without Use of Computers

Introduction
In this new age of globalization and competition, architects have to keep pace with advancements in technology. In building design, computers aid in drafting, modelling, and visualization. This essay analyses building and architectural designs that would not have been possible without the use of computers and cites relevant design examples that have made use of Computer Aided Architectural Design (CAAD) software. The impact and effect of the use of computers in the development of building and architectural designs will be analyzed, as well as what events and external ideas have been incorporated in the development of these architectural designs. Comparison will also be made with designs that were practiced earlier on, citing what advantages the present methods have over the earlier ones. The way these designs impact society will also be expressed, and the influence they have on the community will be considered in the discussion. Analysis will also be based on the impact of building design on environmental sustainability.

The history of building and architectural design is marked by a number of successive trends, based on design, materials used, technologies, and the buildings' durability. Early buildings were constructed with perishable materials such as leaves and animal hides. Later, timber, brick and mortar, and concrete were embraced. Currently, synthetic materials are used to create more daring and complex structures. There is also the trend of constructing larger structures with extremely strong materials. Interior environmental control is another of the trends witnessed: the need to control temperature, humidity, light, and many other aspects is now becoming an inherent part of building design. 'The practice of architecture emphasizes spatial relationships, orientation, and the support of activities to be carried out within a designed environment' (Denison 2009). Architectural designs are mainly distinguished in terms of their uniqueness and how appropriate they are for a particular use or function (Denison 2009). The technology used in the preparation of these architectural designs is my main point of discussion in this essay.
'Architectural traditionalism' emphasized handwork in the preparation of building designs. This method of design is being phased out by ever-evolving creative ideas crafted with the aid of computer-developed architectural designs. Architectural firms are now using advanced studio software as well as on-site technology to outsmart their competition. The use of computers in the creation of designs assists architects in developing designs that gain global acceptance in terms of uniqueness, durability, functionality, and environmental sustainability. The use of computers in building design has also helped improve the presentation and quality of architectural drawings. Adopting digitalized computer systems and programs such as CAAD will therefore see architects develop more magnificent designs and in turn reap the benefits therein (Tai 2012).
The use of computers in architectural design has made it possible for architects to make use of complex and multifaceted design information. This is a result of the development of computer software that makes it possible to work out complex designs before the actual process of construction. Computer-aided architectural design programs have resulted in more accurate designs and more comprehensive records when it comes to building designs. Since the 1960s, computer-aided design (CAD) programs have been used by architects in producing architectural designs. However, this software lacked some tools that architects considered relevant during their architectural projects, which resulted in the development of a distinct class of software specifically designed for use in architectural design: CAAD.
There are various architectural designs which may not have been possible without the use of computers.
Examples of architectural designs
Without going into explicit detail on the various architectural designs generated with computer software, a well-known example is the Burj Dubai project: the Burj Dubai tower is an architectural design created with the use of a computer-aided architectural design software program. The Sliding House, Reflection of a Mineral, the Byron Bay house, the Hangar Prefab, the Swiss Chalet, and the Marinette Residence are other examples of architectural designs generated with the use of computer-aided architectural design (CAAD) (Bruinessen, Hopman, DeNucci & Oers 2011).
The London Gherkin is also an example of a building with a complex design structure that required the use of computer-aided software in its architectural design. The London Gherkin is known for an unusual design structure that could not have been designed without the use of computers; it is classified among the nine most mathematically fascinating buildings in the world. The building is round, with a narrow top and a bulge at the centre. In addition to CAAD, parametric modeling was also used to design the building (Josie 2011).
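To illustrate what parametric modeling means in this context, here is a minimal sketch: the tower's profile is produced by a formula, so adjusting one parameter regenerates every floor plate at once instead of requiring each drawing to be redone by hand. The formula and numbers below are invented for illustration and bear no relation to the Gherkin's actual geometry.

```python
def floor_radius(z, r_base=24.0, b=0.1, c=0.001):
    """Floor-plate radius (in metres) at height z: a parabola that starts
    at r_base, swells to a maximum at z = b / (2c), and tapers toward the top,
    mimicking a profile that bulges below mid-height and narrows above it."""
    return r_base + b * z - c * z ** 2

# Regenerating the whole envelope is just re-evaluating the formula at each level.
profile = [round(floor_radius(z), 1) for z in (0, 50, 90, 180)]
print(profile)  # [24.0, 26.5, 24.9, 9.6]
```

Changing `b` or `c` and re-running the loop reshapes every floor simultaneously, which is the essence of the parametric approach used for such geometry.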
The Zaha Hadid building in Hong Kong is likewise a building whose complex structure could not have been created without a computer software design program. Computer-aided design allowed the architects to explore various shapes for the building without first settling on one. Even with computer support this was a demanding task; without it, it would not have been achievable at all (Design Museum + British Council 2007).
Another important example of a complex design is Frank Gehry's Fisher Center in NYC, which would not have been possible without computer-aided architectural design. Gehry has worked on architectural designs ranging from small houses to large buildings, yet even with all his experience he could not have designed this building without the aid of a computer. The building's style is deconstructivist postmodern. In addition, the design wrapped the ground floor around the sides of his older house, extending it so that it reached the street (Archinomy 2010).
The designs cited above relied on complex blueprints that could not have been generated accurately without computer programs. New and even more complex designs are still being generated with the help of computer programs and CAD drafting tools such as AutoCAD. The design process requires that architectural drawings be up to date and accurate, so that the physical characteristics of the buildings under construction can be clearly defined.
The use of computer technology to produce sophisticated architectural designs is considered more advanced than traditional methods because it provides input tools that streamline, draft and document the design process for easier understanding. This makes complex architectural designs easier to achieve, without requiring that all the design work be done at once.
How the Work Compares With Other Practices of Design
Computer-generated architectural designs make use of computer-aided design and drafting (CADD) software, which represents architectural graphics as vectors. These vector graphics depict the objects being designed and can also be rendered into raster graphics of the object. Compared with traditional and other design practices, computer-aided architectural designs are more accurate and can handle greater complexity.
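To make the vector-versus-raster distinction concrete, here is a small, hypothetical Python sketch (not taken from any CADD package): a wall stored as a vector (just two endpoint coordinates) is rasterised onto a fixed grid of cells using Bresenham's line algorithm, the classic way vector geometry is turned into pixels.

```python
def rasterise_segment(x0, y0, x1, y1, width=20, height=10):
    """Plot the vector segment (x0,y0)-(x1,y1) onto a width x height
    character grid using Bresenham's line algorithm."""
    grid = [[' '] * width for _ in range(height)]
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx + dy
    while True:
        if 0 <= x0 < width and 0 <= y0 < height:
            grid[y0][x0] = '#'                    # mark this raster cell
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:                              # step in x
            err += dy
            x0 += sx
        if e2 <= dx:                              # step in y
            err += dx
            y0 += sy
    return [''.join(row) for row in grid]

# The same vector data (four numbers) can be rasterised at any resolution:
rows = rasterise_segment(0, 0, 19, 9)
```

The vector form stays exact at any zoom level; only the raster output is resolution-dependent, which is why CADD keeps the vector model as the source of truth.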
In addition, a design produced with computer software can be set aside and resumed at a future date: the architect saves the work and reopens it when ready to continue. Other design practices do not require sophisticated software to arrive at a final product, because the work they involve is not as complex as the design of a building (Shaffie 2011).
The Impact and Effect of the Use of Computers in the Development of Building and Architectural Designs
The use of computers in the architectural sector has had both positive and negative impacts on building and architectural design. On the positive side, software has enabled architects to produce more accurate designs: the computer-aided programs used in architectural design record building measurements as vector coordinates, which are more precise than measurements drawn manually with a ruler and pencil (Bruinessen, Hopman, DeNucci & Oers 2011).
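To illustrate where that extra accuracy comes from, here is a minimal Python sketch (the room and all its dimensions are invented for the example): once a floor plan is stored as exact vertex coordinates, wall lengths and floor area are computed from the numbers themselves rather than measured off a scaled drawing.

```python
import math

# A room stored as vector vertices in metres (assumed figures, for illustration).
room = [(0.0, 0.0), (6.4, 0.0), (6.4, 4.25), (2.0, 4.25), (2.0, 3.0), (0.0, 3.0)]

def wall_lengths(poly):
    """Exact length of each wall between consecutive vertices."""
    n = len(poly)
    return [math.dist(poly[i], poly[(i + 1) % n]) for i in range(n)]

def floor_area(poly):
    """Shoelace formula: exact area of any simple polygon from its vertices."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] - poly[(i + 1) % n][0] * poly[i][1]
            for i in range(n))
    return abs(s) / 2.0
```

A pencil drawing at 1:100 scale is only as accurate as the draughtsman's eye; the coordinates above carry no such measurement error, which is the precision advantage described in the text.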
In addition to accuracy, the use of computers has allowed architectural designs to be produced faster than would otherwise be possible. Speed is one of the principal advantages of computers, so any activity carried out with a computer is completed in far less time than the same activity performed without one.
Every advantage comes with disadvantages, and architectural design with computers is no exception. For instance, to design a building with computer-aided software, an architect must be computer literate, which means spending additional time and money on training in order to use the software as required (Bruinessen, Hopman, DeNucci & Oers 2011).
How Events and External Ideas Have Been Incorporated in the Development of These Architectural Designs
Four major events marked the early development of computer-aided architectural design. The first was the studies carried out by Clark and Souder in 1963 and 1964, which laid the foundation of the computer-aided layout. The second event, in 1965, involved representing and manipulating buildings graphically as objects. The third involved laying down foundations for methods of design; in 1964 this view was considered important to the use of computers in architectural design. The final event was the idea of the architectural machine, which emerged in 1972 within the field of robotics: the machine was meant to be intelligent enough to take part in any design activity while cooperating with the architect in a dialogue (Shaffie 2011).
The Impact of Computer Aided Architectural Design on Society
The concepts available in CAAD applications influence the creativity and innovative capability of the people who use them. For society, CAAD therefore creates an opportunity for researchers, students and professionals interested in these concepts to build their knowledge, engaging with lessons from the past of architectural design as well as the present and future impacts of CAAD innovation. Computer-aided architectural design has also played a major societal role in urban, city and regional planning, helping to reduce congestion and the security and safety risks associated with it (Bruinessen, Hopman, DeNucci & Oers 2011).
Comparison to Earlier Designs
Compared with earlier work in the architectural field, the designs produced today are clearly more complex. They are also of higher quality than the designs produced before computers were used. This has brought both advantages and disadvantages. One advantage is improved safety: better design procedures have reduced the chances of buildings collapsing, and with them the risk to human life.
There are also disadvantages. The fact that architects must be trained to use computers and software is a major impediment, requiring more time and funds before CAAD can be used. More time spent on design is another disadvantage relative to earlier practice: because most computer-delivered architectural designs are complex, architects must pay closer attention to ensure that the output is as accurate as desired.
Influence of Computer Aided Architectural Design on the Community
Computer-aided architectural design has greatly influenced the community by promoting interaction among its members: community members identify the designs they require, and the architects involved ensure that those designs are produced with the help of computers. It has also promoted communication within the community, since architects discussing building designs take into consideration the views community members have expressed about the kinds of buildings they want in their neighbourhood (Bruinessen, Hopman, DeNucci & Oers 2011).
Impact of Building on Environmental Sustainability
Green building has gained momentum as concern for environmental sustainability has continued to rise in the community. The buildings in which people live, work and play are meant to offer protection from the extremes of nature, which can harm our health and the surrounding environment in various ways. Green buildings, also referred to as sustainable buildings, ensure that people live in constructions that are healthy and efficient in their use of resources.
A building's sustainability depends on the construction materials used, which determine how efficiently resources such as energy and water are consumed while also limiting the health risks posed (Bruinessen, Hopman, DeNucci & Oers 2011).
Conclusion
In summary, the use of computers within the architectural field has proved advantageous, enabling the development of architectural designs that are complex in nature. The corresponding disadvantage is that architects must be familiar with computers before they can use computer-aided architectural software to produce such designs.
References
Archinomy 2010, viewed from http://www.archinomy.com/case-studies/1931/frank-owen-gehry
Bruinessen, T, Hopman, H, DeNucci, T & Oers, B 2011, 'Generating More Valid Designs during Design Exploration (cover story)', Journal of Ship Production & Design, vol. 27, no. 4, pp. 153-161, Academic Search Premier, EBSCOhost, viewed 24 April 2012.
Design Museum + British Council 2007, 'Zaha Hadid architecture and design', viewed from http://designmuseum.org/design/zaha-hadid
Desinon, E 2009, 'The History of Building Design', viewed 23 April 2012, from http://www.desinon.com/index.php?option=com_content&view=article&id=50:the-history-of-building-design&catid=35:about-us
Josie, W 2011, '9 most mathematically interesting buildings in the world', Tripbase, viewed from http://www.tripbase.com/blog/9-most-mathematically-interesting-buildings-in-the-world/
Shaffie, H 2011, 'The roots of computer aided architectural form generation', viewed 24 April 2012, from http://faculty.ksu.edu.sa/hs/Research/COMPUTER%20APPLICATIONS%20IN%20ARCHITECTURE%20-%20FORM%20GENERATION%20TOOLS.pdf
Tai, L 2012, 'Assessing the Impact of Computer Use on Landscape Architecture Professional Practice: Efficiency, Effectiveness, and Design Creativity', Landscape Journal, viewed 23 April 2012, from http://lj.uwpress.org/content/22/2/113.short


Computer Thesis

BOUND MANUSCRIPT FORMAT
Font: Bookman Old Style, Size 12
Margin: L - 1.5", R - 1", T - 1", B - 1"
Spacing: Single space (Title Page, Approval Sheet, Executive Summary, Abstract, Appendices); double space (Body, Table of Contents, Acknowledgement)
Page Number: Top-right of the page (no page number on the first page of each chapter or on appendices)
Table Number and Name: Before the table (left alignment)
Figure Number and Name: After the figure (center alignment)

TITLE PAGE
Title (Bold, ALL CAPS)
A Project Study presented to the Faculty of the College of Computer Science In Partial Fulfillment of the Requirements for the Degree Bachelor of Science in Information Technology
Proponents: First Name MI Last Name (arranged alphabetically by last name)
October 2012

APPROVAL SHEET
EXECUTIVE SUMMARY
TABLE OF CONTENTS
LIST OF TABLES

Introduction
Objectives (General, Specific)
Scope and Delimitations
Review of Related Literature (implemented previous studies: international, national, local)
Technical Background
Existing System: Hardware Specifications | Software Specifications
Proposed System: Recommended Hardware Specifications | Software Requirements | Description

Chapter I INTRODUCTION
Project Context
Present scenario/settings/procedure with the existing system
Problems encountered with the existing system
Purpose and Description
Features of the Proposed System that will solve the problems encountered
Benefits that can be derived from the Proposed System

Chapter II METHODOLOGY

A. Requirements Specification
Operational Feasibility: Fishbone Diagram (add description below)
Schedule Feasibility: Gantt Chart
Cost-Benefit Analysis
Data and Process Modeling (diagrams for the Proposed System): ERD, Context Diagram, DFD, System Flowchart
B. Design
Screenshots (forms), Sample Reports
C. Development
Hardware Specifications
Software Specifications
Programming Environment: Front End, Back End
D. Testing Plan (testing plan during development)
E. Maintenance Plan
Gantt Chart (description after the figure)

BIBLIOGRAPHY
Trajano, Emily, "Visual Basic: An Introduction to Object Oriented Programming", 2008
APPENDICES
A. Source Code
B. User's Guide
C. Grammarian Certification
D. Other Relevant Documents

CURRICULUM VITAE (Personal Information, Picture, Educational Background, Seminars/Trainings Attended)

Chapter IV IMPLEMENTATION PLAN
Description
Implementation Contingency
Schedule of Testing (Gantt Chart) (testing plan during deployment)
Project Implementation Checklist
Activities | Finish | Not Finish | On-going
1. Installed IS | v | |

Chapter III RECOMMENDATIONS

Samples:
Table 1. Distribution of Middle Level Managers in terms of Age, IFSU 2011
Age | Frequency (F) | Percentage (%) | Mean | SD
26-30 | 1 | 4.5 | 47.55 | 9.16
31-35 | 0 | 0 | |
36-40 | 5 | 22.7 | |
41-45 | 3 | 13.6 | |
46-50 | 6 | 27.3 | |
51-55 | 1 | 4.5 | |
56-60 | 4 | 18.2 | |
61-65 | 2 | 9.1 | |
TOTAL | 22 | 100.0 | |

Figure 4. Gantt Chart of Schedule of Activities

Chapter 1 INTRODUCTION
Project Context
Every organization is concerned with modernizing its firm to remain competitive. Computerization is already established in our environment and is useful in business transactions, operations, education and other areas. It has supported the success of individuals by reducing the time, the energy and the number of people involved in processing a job.
For this reason, many concerned citizens continue to devise and investigate various types of applications, aiming to capture the advantages that adopting modern technology brings to a company; a firm that adopts it shows clear development. The researchers observed that recording and other operations are slow when done manually.

Purpose and Description
The success of an organization depends on its ability to acquire accurate and timely data or information about its operations, to manage that data effectively, and to use it to analyze the organization's activities and operations. (Sample format of citations.) According to Earls M. Awad, "A system is an organized group of components or elements linked together according to a plan to achieve an objective". Information is needed in virtually every field of human thought and action, and it is generally accepted that computerizing an information system is a great advantage to an organization. One study compared the job performance of management-graduate and non-management-graduate employees working as operators of a computerized student profile system.
Both groups had backgrounds in computer operations. The researcher compared the ratings that administrators gave to the performance of the two groups. One of the findings of the study was that there is a significant difference between the job performance of management-graduate and non-management-graduate employees as assessed by the administrators, and the difference strongly favours the management graduates because their record-keeping is faster and easier. (http://www.sourceface.com/management_1ote35.html)