All posts by Mike Schadone

Revisiting My Goals

When I applied to Walden University, there were choices I needed to make regarding which program to enroll in. I relied on my past experience and my current goals to direct me to the Bachelor of Science in Computer Information Systems (BSCIS) with a concentration in Information Systems Security (ISS), a process that truly motivated me. To let my goals help me navigate the world of academia, I needed to ensure that they still held true. The first assignment in the Introduction to Information Systems class afforded me the opportunity to do just that, and this assignment allows me to review my goals once again.

My affinity toward positive social change (Schadone, 2009) is unwavering, as is my desire to achieve a position in the field of disaster management. I do feel, however, that my chosen degree program is ill-suited to prepare me for such ambitions. Though there has been great incentive to bring the engineering sciences into public policy administration (Connolly, 2008), my experience with the BSCIS degree, even with the ISS concentration, leads me to believe that the curriculum does not satisfy my current needs or goals. I do believe that a career in Information Systems Security would provide an opportunity to reach many of my goals, but other academic directions would give me a more solid foundation to build upon.

As of this writing, I have decided to research other avenues of academia that might be better suited to providing the core educational opportunities that would benefit me most. I have decided that the B.S. in Health Studies with a concentration in Health Administration is a better fit at this time. I hope to use this degree to propel me into an opportunity to earn an MPH with a concentration in Emergency Management and, ultimately, a Ph.D. in the same.

As the H1N1 influenza virus reminds us all of the 1918 “Spanish Flu,” there is an undertone of personal responsibility and preparedness in the event of a pandemic (Bornstein & Trapp, 2009), the conditions for which are favorable. I plan to take personal responsibility in this and other potential disasters by positioning myself as an expert in the field, helping to promote plans and policies to mitigate and respond effectively to such incidents. Though I am versed in the computer sciences, I feel that my efforts would be better utilized as a health official in these times of crisis. Perhaps one day I will return my focus to computing, but until then, my social conscience and sense of community seem to be my defining factors.


Bornstein, J., & Trapp, J. (2009, June). Pandemic Preparedness: Ensuring Our Best Are Ready to Respond. IAEM Bulletin, 26(6), pp. 6, 14. Retrieved August 22, 2009, from

Connolly, J. (2008, September). Bridging the gap between engineering and public policy: A closer look at the WISE program. Mechanical Advantage, 19. Retrieved August 22, 2009, from

Schadone, M. F. (2009). Information Systems and Me: My Professional and Career Goals. Minneapolis, MN: Walden University.

A Datastore Discussion

O’Brien and Marakas (2007) explain the importance of disaster recovery with regard to a company’s computing resources: “Many firms could only survive a few days without computing facilities. That’s why organizations develop disaster recovery procedures and formalize them in a disaster recovery plan.” This is the basis of the business plan submitted in a subsequent assignment (Schadone, 2009), in which the focus is mitigating computing loss and recovering from it.

Information technology relies on the acquisition, storage, and retrieval of pertinent data. Developing a business plan requires adapting a data schema to manage the influx of information that could be useful to a growing company, if not required for the functioning of the business. Figure 1 shows the core data schema for tracking customers and their needs. This schema is certainly not all-inclusive, but it provides a framework that can be built upon depending on the corporate direction and specific requirements.
This information can obviously be used to provide for clients’ needs, but it can also be used to improve organization and deliver specific solutions based on measured metrics. As Figure 1 shows, in addition to acquiring and storing demographic information, the schema allows for the assignment of specific roles. These roles allow a portal application to present only the most needed system monitors for each role, including role-specific alerts and notifications. Greeting the client by name, based on login data, also lends an air of security, prompting the user to log out when the session is complete. This demographic data can be useful in providing personalized support as well, giving support personnel access to each contact’s information and providing a basis for a schema specific to online support systems based on customer needs. Whether storing preferences or previous form entries, personalized use of collected data can simplify processes for the user, making the user more efficient in the end.
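To make this concrete, here is a minimal sketch of such a customer-and-roles datastore using Python's built-in sqlite3 module. The table and column names are hypothetical illustrations of the idea, not taken from Figure 1:

```python
import sqlite3

# Illustrative schema: customers, assignable roles, and a many-to-many
# link table so a portal can show role-specific monitors and alerts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT
);
CREATE TABLE role (
    role_id INTEGER PRIMARY KEY,
    title   TEXT NOT NULL          -- e.g. 'admin', 'support', 'billing'
);
CREATE TABLE customer_role (
    customer_id INTEGER REFERENCES customer(customer_id),
    role_id     INTEGER REFERENCES role(role_id),
    PRIMARY KEY (customer_id, role_id)
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Ada Lovelace', '')")
conn.execute("INSERT INTO role VALUES (1, 'support')")
conn.execute("INSERT INTO customer_role VALUES (1, 1)")

# A portal could use this join to greet the user by name and pick the
# monitors appropriate to their role.
row = conn.execute("""
    SELECT c.name, r.title FROM customer c
    JOIN customer_role cr ON cr.customer_id = c.customer_id
    JOIN role r ON r.role_id = cr.role_id
""").fetchone()
# row == ('Ada Lovelace', 'support')
```

The separate link table is the design choice that lets one customer hold several roles without duplicating demographic data.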


O’Brien, J. A., & Marakas, G. M. (2007). Introduction to information systems (14th ed.). New York: McGraw-Hill/Irwin.

Schadone, M. F. (2009). Disaster Response and Mitigation – IT: A Business Proposal. Minneapolis, MN: Walden University.

Figure 1.
Datastore Chart

“Disaster Response and Management – IT” (DRAM-IT)

With the growing focus on disaster mitigation, response, and recovery, companies that rely on information systems need to prevent and minimize the impact of disasters (whether natural or man-made) on their infrastructure. Society’s focus is on regaining a sense of normalcy, which requires a functioning economy, thereby increasing the need for companies to recover quickly.

By providing expert philosophies, procedures, systems and tools, DRAM-IT can ensure that the client will transition seamlessly from pre-disaster to post-disaster with no negative long-term effects.

We start with employee-focused health, safety, and security. We believe that the employee is the first defense against failure. Employees should be healthy and not have their minds occupied by domestic problems (e.g., family welfare), which is why, in times of a disaster affecting the community, we contract with armed security agencies to provide protective security for key employees and their families. This focus allows other employees to take care of their own before returning to work. The same security force provides on-site perimeter security, allowing employees to feel safe while aiding in recovery efforts. Before an incident ever occurs, we create processes to help each employee stay healthy and fit, both physically and mentally, including the creation of medical response teams to manage on-site medical emergencies until EMS can arrive.

The cost of data loss can be immeasurable and therefore cannot be tolerated. After performing a forensic analysis of current IT practices, DRAM-IT will offer methods of securing data in redundant distributed arrays with cryptographic hashing to ensure the data has not been, and cannot be, manipulated. Along with distributed storage, we can offer distributed processing to ensure the business keeps running without the need for direct input by employees.
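As a minimal sketch of the kind of integrity check described above, Python's standard hashlib can fingerprint each stored block so any manipulation is detectable. The digest scheme and record format here are illustrative, not DRAM-IT's actual design:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest used to detect any change to stored data."""
    return hashlib.sha256(data).hexdigest()

record = b"backup block #42"
stored_digest = fingerprint(record)   # kept alongside each replica

# Later, on any replica: recompute and compare before trusting the data.
assert fingerprint(b"backup block #42") == stored_digest   # intact
assert fingerprint(b"backup block #4Z") != stored_digest   # tampered
```

In a redundant array, a replica whose digest does not match would be discarded and rebuilt from a replica that does.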

During a disaster, the focus needs to be on initiating recovery processes, which requires interfacing with local authorities to be part of the solution. We will provide the internal Incident Command structure, which will integrate with local, state, and federal efforts to ensure the pooling of resources. We are also committed to the community: the faster the individual entities of a community can recover, the faster the community as a whole can heal.

With DRAM-IT Systems Mitigation, Response and Recovery, we can ensure that you can concentrate on what is important… we’ll take care of the rest.

By providing an all-encompassing approach to disaster management, we assure our clients of continuous critical-systems processing, ensuring business continuity throughout the disaster.

Table Title: Examples of Structure and IT Needs

Functional Area (see Figure 7.23)       Supporting Information Systems (see Figure 1.6)
Example: Human Resource Management      Example: Transaction Processing Systems
Command                                 Executive Information Systems
Operations                              Decision Support & Strategic Information Systems
Tactical                                Knowledge Management & Expert Systems
Logistics                               Specialized / Transaction Control Systems
Finance                                 Specialized / Transaction Control Systems

Subject: Investment Opportunity – “Disaster Response and Management – IT (DRAM-IT)” 02/25/14
To Whom It May Concern:

I am writing to you as an entrepreneur in support of the community. We have faced a number of disasters recently, and our economy continues to suffer. I hope to provide a host of services to companies that are key to the community’s infrastructure. My goal is to help these key companies recover from disasters internally, giving the economy the maximum benefit in a minimal amount of time.

As a critical care paramedic who has worked with FEMA response teams, I have the experience and education to know what is crucially important during a disaster. As a computer programmer and IT professional, I know how to apply that knowledge to critical business systems, ensuring a smooth transition through the various phases of a disaster, whether large or small, internal or external.

I wish to provide mitigation training, on-site employee health programs, redundant communications, secure data storage and retrieval with distributed data processing, personal and protective security, and adaptive processes and philosophies that can overcome even the most destructive of forces. We will initially focus on consulting, with the promotion of best practices in mind. During the disaster phase, we will respond directly as Incident Command Teams that will be fully self-sufficient for over 72 hours to ensure the response and recovery are as smooth as possible.

The unfortunate reality is that this endeavor will require a large amount of start-up capital. We must first hire and train appropriate personnel who can then consult with client companies and ensure they can operate effectively during and after a disaster. We also need access to distributed networks on which to operate. These will undoubtedly be fee-based services, but initial investments of processor time and storage would be invaluable. Investing in this opportunity is investing in the community.


Michael Schadone

Does IT Matter: An Article Review & Response

Review: Information Technology

In his article, Carr (2003) discusses the economic growth versus the ubiquity of information technology and the impact this has on corporate stability. Carr likens the emergence and innovation of information technology to that of the electric power grid and the continental railroad: those who invest in an emerging technology tend to out-pace their competition initially, while competitors who wait for standards to emerge from the same technology can steadily grow beyond them at a lower cost. In contrast, those who never adopt the standardized innovation are unable to compete within the market and eventually fail.

Carr continues by describing business’s dwindling reliance on information technology as a cutting-edge innovation, treating it instead as a required commodity whose largest risk is overspending. This phenomenon is stated clearly in the economic law of diminishing marginal utility, which holds that as one obtains more of a particular good, its marginal utility eventually declines. Businesses require information technology to stay competitive but are now required to focus on efficient use of the technology. As Carr states, the bubble has burst and the time of initial investment has passed.

Standards have been established and innovation will occur steadily and in stride. Like with electricity, businesses need to have Information Technology incorporated within their business model, but gone is the time of unfettered spending. A strong IT infrastructure is certainly a requirement in this age of computing, but there must be a plan in place to implement any further innovation and avoid overspending on a resource that may provide very little in the way of financial return. In the upcoming years of this young industry, IT professionals must learn to focus their efforts and clearly delineate needs and solutions.

Response: Information Technology

Does IT matter? In the age of computers, it, of course, matters. The real question is where IT matters, or where IT can matter. In their editorial discussion, Grover et al. (2009) posit that there need to be methods of allowing dissenting views to be heard and argued in a forum that fosters positive growth. Perhaps, in this domain, IT professionals can come together and provide positive solutions to serious problems affecting the information field.

Many other professions, young and old, face this same dilemma. As an example, in the 1980s, when firefighters changed their focus from fire suppression to fire prevention, their efforts were so overwhelmingly successful that the incidence of residential and commercial fires decreased and there was no longer a need for so many firefighters. Luckily for the firefighters, there were other niches to fill, and firefighting jobs, though less specialized now, are no longer threatened for a lack of need (Falkenthal, 1999).

We need to find niches for IT. We need to understand where application of IT provides the best solutions. I put forth that IT professionals should be looking for ways to improve the non-IT world.


Carr, N. G. (2003). IT Doesn’t Matter. Harvard Business Review, 81(5), 41-49.

Falkenthal, G. (1999, March). It’s time for us to reclaim our fire service. Fire Engineering, 152(3), 32-35.

Grover, V., Straub, D., & Galluch, P. (2009). Turning the corner: The influence of positive thinking on the information systems field. MIS Quarterly, 33(1), iii-viii.

Innovation of Technology

Any expansion of the core infrastructure has historically driven technological growth spurts. From the advent of fire and electricity to assembly-line manufacturing, huge growth in technology has followed these breakthroughs, but what is truly impressive is the exponential growth when these technologies are combined.

The telephone is a great example of this growth. Telephone systems evolved from the telegraph when Alexander Graham Bell combined his expertise in acoustics and oration with his knowledge of electricity. Bell, at the time, was attempting to perfect a multi-band, or musical, telegraph (Casson, n.d.). I feel that Bell’s contribution to the telephone, and the work of others succeeding in the field, resulted in his lifelong dream of the musical telegraph being realized as he meant it to be, unfortunately well after his death. The computer modem is such a device, using multiple tones in quick succession to communicate with other modem-equipped computers. The same concepts have been applied to develop the broadband technology most of the world now relies upon.
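The tone-per-symbol idea can be sketched in a few lines. Early modems used frequency-shift keying, mapping each bit to one of two audio tones; the frequencies below follow the Bell 103 originate channel, though the mapping function itself is purely illustrative:

```python
# Frequency-shift keying in miniature: each bit becomes a tone, and a
# stream of bits becomes a quick succession of tones on the phone line.
MARK_HZ, SPACE_HZ = 1270, 1070   # Bell 103 originate channel: 1 / 0

def bits_to_tones(bits):
    """Map each bit to the audio frequency a modem would emit for it."""
    return [MARK_HZ if b else SPACE_HZ for b in bits]

tones = bits_to_tones([1, 0, 1, 1])
# tones == [1270, 1070, 1270, 1270]
```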

Whenever an innovation of technology occurs, it gives more people more opportunity to expand on it. With this in mind, I feel the biggest benefit of Internet2 and IPv6 would be the spark of innovation that is sure to come soon after their acceptance.


Casson, H. N. (n.d.). The history of the telephone. Electronic Text Center, University of Virginia Library. Retrieved June 22, 2009, from

Wireless reliance

Wireless information appliances will improve overall performance and communications, but they will also have an adverse impact, as we see today with BlackBerry devices. Many people who work with BlackBerry devices disregard them during off hours because they become bothersome. This is detrimental, as an instantaneous notification is usually expected to be answered immediately. We will see more of this effect with these devices. On the other hand, those who welcome the ability to be connected and available at all times will be more accessible and therefore, perhaps, viewed by others in a better light. These people will become the “go-to” people and increase others’ perception of them on the network. This will lead to reliance on in-house electronic social networking to promote the usefulness of improved connectivity. Realistically, organizations must be clear about the expectations and responsibilities that come with the increased connectivity of these and other wireless information appliances.

Another issue with increased connectivity is the increased opportunity for exploitation. Metcalfe’s Law states that a network becomes exponentially more valuable as its user base increases; the inverse should also hold true, in that the network becomes increasingly vulnerable with a significant rise in membership and in the connections themselves. Security becomes exponentially more important as the network becomes more valuable.
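The arithmetic behind this claim is simple to sketch. Metcalfe's Law ties a network's value to its number of potential pairwise links, n(n-1)/2, and every one of those links is also a possible path of exploitation:

```python
def potential_links(n):
    """Pairwise connections in a network of n users: n*(n-1)/2.
    Metcalfe's Law takes the network's value as proportional to this;
    each link is equally a potential avenue of attack."""
    return n * (n - 1) // 2

assert potential_links(10) == 45
assert potential_links(100) == 4950   # 10x the users, ~100x the links
```

So a tenfold growth in membership multiplies both the value and the attack surface roughly a hundredfold, which is why security effort must scale with the network rather than with the user count alone.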

Whenever I talk about network security, I try to relate it to the brick-and-mortar world: Homes in rural areas with unlocked doors are more secure than the dead-bolted homes of the urban environment.

Comparison of Property Management Software Solutions

In an attempt to isolate two possible property management software solutions from the many available, it was imperative to look at a significant number of options and choose the best two options from that list. Six different software solutions (Buildium™, PropertyBoss™, Propertyware™, Rent-Right™, Tenant-Pro™, and Yardi™) were chosen to be included in the initial comparison based on an internet search for “property management software” and an ad-hoc conversation with a property management professional. The search was specific to the management of 50 properties without regard to specialization in the various property types as it is assumed that any robust solution should be able to handle any property type with little modification.

Figure 1 shows the six chosen solutions in table format, listed in random order of search discovery, correlated with ten requirements (a document management system, a property portfolio, a tenant portfolio, automated financial management, tenant complaint and incident tracking, a service request management system, a work order management system, vacant property marketing tools, tenant screening tools, and customizable report generators) chosen as valuable traits for any property management solution. Two of the solutions are commercial, off-the-shelf (COTS) products, while the remaining four are internet-based solutions, or products from application service providers (ASPs). The two COTS solutions (Rent-Right™ and Tenant-Pro™) were immediately discounted, as they met the fewest of the requirements while offering no comparatively significant cost savings (Domin-8 Solutions, Inc., 2007, 2008).
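The screening in Figure 1 amounts to counting how many of the ten requirements each candidate meets. A rough sketch of that tally follows; the solution names and feature sets below are invented for illustration and are not taken from the vendors' actual documentation:

```python
# Ten requirement categories, mirroring the list above in shortened form.
REQUIREMENTS = {
    "documents", "property portfolio", "tenant portfolio", "financials",
    "complaint tracking", "service requests", "work orders",
    "vacancy marketing", "tenant screening", "custom reports",
}

# Hypothetical candidates and the requirements each claims to cover.
candidates = {
    "Solution A": REQUIREMENTS - {"custom reports"},
    "Solution B": REQUIREMENTS,
    "Solution C": {"documents", "financials", "work orders"},
}

# Score each candidate by requirement coverage and rank them.
scores = {name: len(feats & REQUIREMENTS) for name, feats in candidates.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
# ranked[0] == "Solution B"
```

A real comparison would of course weight the requirements and fold in pricing, but coverage counting is enough to justify dropping the weakest candidates early.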

Of the remaining four solutions, all ASPs, Yardi™ (Gnosio, 2002a, 2002b) emerged as the most costly option and was discounted as a viable choice, leaving three solutions to compare.

The three remaining solutions appear to be robust and feature-rich. Of the three, PropertyBoss™ was removed from consideration for its lack of publicly available pricing details; third-party sources also indicate it is the most costly of the three products (O’Bannon, 2006; Real Estate Center at Texas A&M University, 2004).

The greater list of six solutions has thus been narrowed to two robust and viable options for a property management solution: Buildium™ (2009) and Propertyware™ (2008). Though both could be recommended based on the initial research and product comparison, the stated goal is to narrow the choices to a single recommendation. To this end, a detailed comparison of features and cost must be made.

Buildium™ and Propertyware™ both meet most of the requirements. Propertyware™ lacks only customized reporting, though it does allow for customized fields within the database that are reportable; this requirement is not met in a strict sense, but the feature may simply be under-advertised. Buildium™, on the other hand, meets all requirements but charges extra fees for some services, so a purchaser’s direct needs should be accounted for to avoid significant added cost.

After reviewing all available information, the recommended solution is Propertyware™. This recommendation is made with caution, as the specific needs of a property manager may well be better met by another solution. Business owners who rely on software to track legal records should both research their options themselves and consult an attorney about any legal considerations.


Buildium, LLC. (2009). Buildium online property management software. Retrieved June 12, 2009, from

Domin-8 Enterprise Solutions. (2007). Rent-right property management software. Retrieved June 12, 2009, from

Domin-8 Enterprise Solutions. (2008). TenantPro 7 property management software. Retrieved June 12, 2009, from

Gnosio. (2002a). Email from Paul T. Monson. Retrieved June 12, 2009, from

Gnosio. (2002b). Software solutions for property management. Retrieved June 12, 2009, from

O’Bannon, I. (2006, January). PropertyBoss solution [Electronic version]. The CPA Technology Advisor. Retrieved June 12, 2009, from

Propertyware, Inc. (2008). Propertyware product feature comparison. Retrieved June 12, 2009, from

Real Estate Center at Texas A&M University. (2004). Real estate software directory [Survey]. Retrieved June 12, 2009, from

Yardi Systems, Inc. (2009). Residential Voyager [Brochure]. Retrieved June 12, 2009, from

Figure 1. Comparison chart of six Property Management Software solutions indicating the two most viable options.


Defining OS

The purpose of a computer operating system is to allow programs to run on the computer and utilize the installed hardware. It is no less than the soul of the machine. While a calculator requires only a simple arithmetic engine with simplified inputs and outputs, a large research mainframe requires a much more complex system of input, output, storage, memory management, and busing to connect peripheral devices.

A Brief History

In the 1950s and ’60s, institutions that owned computers (at the time, machines that took up entire rooms) had to write the operating system for each machine based on their needs. This was not an efficient means of programming: every computer upgrade required rewriting the software that ran it, which was very costly. Additionally, these simplistic operating systems allowed only one set of operations to run at any given time, which wasted resources and kept processor time expensive.

During the 1960s, a large multi-institutional group (led by MIT) attempted to solve this problem by creating an efficient, multi-user, timesharing system. Though it made some breakthroughs, the operating system it designed, Multics (Multiplexed Information and Computing Service), was still bulky and inefficient, and the project was soon abandoned.

A few die-hard users at Bell Labs decided to continue the effort, and after a few years, UNIX was born; the name was an intentional pun on the operating system that preceded it.

UNIX was the first operating system to promote modular programming and data pipes, which set the standard for operating systems to come. Its versatility is apparent in its ability to command a range of devices from mainframes to microcomputers. UNIX has been described as being “of unusual simplicity, power, and elegance…. Its development and evolution led to a new philosophy of computing, and it has been a never-ending source of both challenges and joy to programmers around the world” (Bell Labs, n.d.).
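The pipe idea is easy to show in miniature: the output of one small program becomes the input of the next. The sketch below chains two processes from Python, the equivalent of the shell pipeline `echo hello | tr a-z A-Z` (the commands are ordinary Unix tools, chosen only for illustration):

```python
import subprocess

# Build a two-stage pipeline: echo produces text, tr upper-cases it.
p1 = subprocess.Popen(["echo", "hello"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["tr", "a-z", "A-Z"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
p1.stdout.close()   # let p1 receive SIGPIPE if p2 exits early
out = p2.communicate()[0].decode().strip()
# out == "HELLO"
```

Each program stays small and single-purpose; the pipe is what composes them, which is the philosophy the quotation above is praising.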

The First UNIX Port

Just as UNIX was being tapped as a useful business tool, one of its developers took a sabbatical teaching position at the University of California at Berkeley (UCB), where he taught classes on UNIX. Professors and students at UCB continued development of the operating system on their own and eventually, with funding from DARPA, created the BSD operating system, ported from Bell Labs’ UNIX.

UNIX is certainly the precursor of the contemporary operating system; not only did it prove to be a reliable, efficient, and usable operating system in its own right, it is responsible for much of the growth of computer technology over the last four decades. It defines the contemporary operating system and is the standard for comparison.

UNIX has been modified or reimplemented to run on mainframes, supercomputers, and microcomputers, including desktop PCs (via the UNIX-like Linux) and Apple computers (via NEXTSTEP).

The NEXTSTEP For Apple

Apple’s Mac OS X is derived from NEXTSTEP, which combines the Mach kernel with BSD UNIX components and relies heavily on open-source software packages, essentially free software programs whose source code is available for user-level modification. Apple attempts to use the security and efficiency of UNIX while competing directly with Microsoft for market share (Singh, 2003).

The Mainstay

Currently, Microsoft Windows XP is essentially the operating system of choice for desktop computing, and Microsoft also holds a large market share of server-platform operating systems. Focused on streamlining usability, Microsoft trades efficiency and security for user-friendliness. Though Microsoft has been attempting to integrate UNIX philosophies into its operating systems, it has been unable to do so without sacrificing its business logic (UNIX and UNIX-based operating systems rely heavily on open-source programming and on the consumer for finding and reporting bugs, or programming errors).

The closest Microsoft has come to integrating these philosophies was Windows Longhorn. Unfortunately, while trying to get Longhorn to market, Microsoft cut many of the UNIX-friendly features and implemented a tighter security scheme atop what otherwise resembles XP. This release was called Windows Vista (Greene, 2004).

With these trade-offs, Microsoft alienated many PC users through Vista’s obtrusive security implementation, a direct result of the heavy integration of Microsoft’s web browser, Internet Explorer, into the operating system. This practice seems to go against every contemporary philosophy of what an operating system is.


Bell Labs, Inc. (n.d.), The Creation of the UNIX* Operating System. Retrieved June 10, 2009, from

Greene, J. (2004, April 19). How Microsoft Is Clipping Longhorn [Electronic version]. Business Week, p. A1.

Singh, A. (2003), What is Mac OS X? Retrieved June 10, 2009, from

Price and Performance Trends for Computer Hardware

It is important for computer professionals to be able to forecast future technology performance and pricing. To fulfill future purchasing and support requirements, the IT professional should be able to analyze historical data and arrive at an approximate, though accurate, figure.

Though it is beyond the scope of this paper to find an elegant solution to the problem, I will examine whether it is feasible to use an average of the average biennial difference (aBD) and the linear slope value (sV) of historical data rather than a complex statistical model.


To forecast current prices and performance of various computer hardware components, I will use historical data from O’Brien and Marakas (2007) and attempt to approximate current prices and performance growth by averaging the aBD and sV derived from those data. I will then check current prices and compare my results with a published price list available on the internet.

Design and Procedure

This experiment is limited to growth in performance and pricing for central processing units (CPUs) (see Figure 1), random-access memory (RAM) chips (see Figure 2), and hard-disk drives (HDDs) (see Figure 3) over the 15-year period from 1991 to 2005. With biennial data, I will find the sV of each set and the aBD of the values of each set, then average the two. I will repeat this process for each component, for both performance growth and cost.
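The procedure above can be sketched as a small function: fit a least-squares slope (sV), take the mean of the successive biennial differences (aBD), average the two as a per-step growth estimate, and extrapolate. The sample data below are hypothetical and perfectly linear, chosen only to exercise the method; they are not the O'Brien and Marakas figures:

```python
from statistics import mean

def forecast(years, values, target_year):
    """Extrapolate biennial data by averaging the least-squares slope (sV,
    scaled to a 2-year step) with the average biennial difference (aBD)."""
    xbar, ybar = mean(years), mean(values)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(years, values))
             / sum((x - xbar) ** 2 for x in years))           # sV, per year
    abd = mean(b - a for a, b in zip(values, values[1:]))     # per 2-year step
    step = mean([slope * 2, abd])          # average the two biennial estimates
    return values[-1] + step * (target_year - years[-1]) / 2

# Hypothetical linear data: grows 20 units per 2-year step, so the
# 2009 forecast should land exactly two steps beyond the 2005 value.
years = list(range(1991, 2006, 2))           # 1991, 1993, ..., 2005
values = [10 * (y - 1991) for y in years]    # 0, 20, ..., 140
# forecast(years, values, 2009) == 180.0
```

On perfectly linear data the sV and aBD agree, so the averaging only matters (and the two estimates diverge) when the historical series is noisy, which is exactly the case the paper investigates.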

To find the current prices, I will use Newegg, Inc. (n.d.), known for reliable and consistent pricing, and compare the results of the historical data analysis.


Computer Processors

The performance growth analysis of CPUs shows a sV of 263.01 MHz with an aBD of 269.64 MHz, a variation of 6.63 MHz with an overall average of 266.32 MHz. The cost analysis shows a sV of $23.28 with an aBD of $26.36, a variation of $3.08 with an overall average of $24.82. Continuing this trend shows that a CPU in 2009 should have a performance of 4.8 GHz with a cost of $626.73.

The typical performance of a CPU today is 2.79 GHz with an average cost of $244.29. The price range per processor is $59.99 – 1,039.99.

A CPU with a performance speed of 4.8 GHz is currently in development and unavailable for price comparison. Though the accumulated speed of multiple processor cores can achieve this figure, that is outside the scope of this paper.

Random-access Memory

The performance growth analysis of RAM chips shows a sV of 103.2 MB with an aBD of 142.79 MB, a variation of 39.59 MB, with an overall average of 122.99 MB. The cost analysis shows a sV of $1.14 with an aBD of $6.71, a variation of $5.58, with an overall average of $3.93. Continuing this trend shows that a RAM chip in 2009 should have a performance of 2,214.84 MB with a cost of $125.66.

The typical performance of a single RAM chip today is 2 GB with an average cost of $33.20. The price range per chip is $22.00 – 56.00.

Hard-disk Drive Storage

The performance growth analysis of HDDs shows a sV of 18.82 GB with an aBD of 22.85 GB, a variation of 4.03 GB, with an overall average of 20.84 GB. The cost analysis shows a sV of -$24.62 with an aBD of -$26.07, a variation of $1.45, with an overall average of -$25.35. Continuing this trend shows that a HDD in 2009 should have a performance of 375.17 GB with a cost of $23.79.

The performance of a HDD today ranges from 18 GB to 2 TB, though typically 500 GB with an average cost of $91.29 (which is also typical of drives up to 1.5 TB). The price range per 500 GB drive is $49.99 – 379.99.

A HDD with a performance capacity of 375 GB today costs approximately $50.00.


Though some of the results approximate current technology and pricing, the data set used is too small to draw any meaningful conclusions. Differences in device stability, manufacturing technologies, and branding complicate the issue even further. More research is needed to find any meaningful correlation between historical pricing and performance growth and that of the future. Larger data sets analyzed with statistical methods may prove useful, but at this time, averaging the sV of each set with the aBD of the set values does not provide a realistic outlook on future technology.


Newegg, Inc. (n.d.). online store. Retrieved June 5, 2009 from

O’Brien, J. A., & Marakas, G. M. (2007). Computer hardware. In S. Mattson, S. Isenberg & T. Hauger (Eds.), Introduction to information systems (14th ed.) (p. 109). New York, NY: McGraw-Hill/Irwin.

A prediction

My prediction is that solid-state drives (SSDs) will be the effective standard for all business and personal computing platforms within ten years.

With the new SATA 3.0 specification calling for link speeds of 6 Gbps (effectively providing maximum data transfer speeds of 600 MBps) and devices capable of read and write speeds of 200-240 MBps, solid-state drives will gain huge momentum in personal computer integration. At these speeds, installing essential operating system objects on these drives will significantly decrease boot times even without machine-state-saving options like sleep or hibernate modes, together leading to almost instantaneous booting of the computer.

Utilizing the same technology that made USB flash drives so popular, internal SSDs are robust and integrate easily with SATA buses, allowing them to maximize bus speeds, capacity (up to 1 TB), and redundant data protection (i.e., RAID) with lower power demands than their PATA predecessors (250 mV vs. 5 V signaling), which is essential in mobile computing applications.