Winning the Game
Knowing which side you’re on is such an important aspect of winning the game! That poor chicken keeps crossing the road, probably in an attempt to answer the question! At HTML Goodies we don’t want you to be stuck in that same quandary, so we are offering a brief explanation to help you pick sides, to know which side you’re on, and to change to the other when you need to.
What sides am I talking about? I’m so glad you asked. In the world of web pages
the two sides we have to worry about are the "server side" and the "client
side". Let’s take a closer look.
It’s worth taking a quick look at the evolution of computers to provide a
background against which we can see the current state of things. In the early
days of business computing the method was to collect information together, punch
it into cards and feed it into the first of a series of programs that would run
on the computer. This program would stage the data for the next program, and so
on through the system of programs. This method is called batch processing.
With the advent of CRT terminals it became possible to input one transaction
into a program at a time. IBM dubbed this type of computing "teleprocessing".
While it got away from the need to gather data together, typically for a day’s
worth of business, and then wait for the nightly process to update files, it
still left all the computing in a centralized "mainframe" computer system. This
limited the geographical area that could reasonably be served by the computer.
Transmission of data over distance was both slow and expensive. Then along came
the Personal Computer and changed the rules completely.
In between the "heavy iron" mainframe computers and the PCs were mini computers
that enabled the idea of distributed computing. This is where information is
"pre-processed" in local areas providing preliminary and local results back
quickly; then sent to central systems for consolidation and full processing.
Consolidated results can then be distributed as needed. PCs got more powerful
and started to close the gaps. With local area networking came the ability to
have files reside on one computer that could be accessed by many PCs. The
computer housing the file became known as the "server" computer, while the PC
accessing the file was the "client". Add processing power to the server, such as
a database management system, and we have what is known as client-server
computing. As the capabilities at both the client end and at the server end
increase, so the power and advantage of client-server computing increases. The
servers become server farms, or networks of connected server farms; and
eventually, internetworked servers and server farms and the internet.
The Game on the Web
Then there’s the web. Perfectly described as a client-server application, the World Wide Web comprises "pages", which are files residing on servers, accessed
by client applications (browsers) on client computers.
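To make that description concrete: the request a browser sends for a page is itself just text travelling over a network connection. This is only an illustrative sketch (the host and path shown are made up), but it builds the opening lines of a real HTTP/1.1 GET request of the kind a browser issues for every page:

```javascript
// A sketch of what a browser's request for a page boils down to:
// plain text sent over a network connection to the server.
// The host and path below are hypothetical examples.
function buildGetRequest(host, path) {
  return "GET " + path + " HTTP/1.1\r\n" +
         "Host: " + host + "\r\n" +
         "\r\n"; // blank line ends the request headers
}

var request = buildGetRequest("www.example.com", "/index.html");
```

The server's reply, the HTML file, travels back the same way, and the browser (the client application) renders it.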
Well that’s a nice piece of history, but what’s the point? Again, I’m so glad
you asked! The importance of this to you as a web developer is to know what data
you can manipulate and when. If you want to manipulate data that lives in databases on the server, you need server-side technology. The web pages themselves also reside on the server, and so are likewise manipulated by server-side technology. Once a page has been displayed on the client machine, you need client-side technology to manipulate any data associated with it. For example, if you have displayed a form on which your site visitor is entering data, and you wish to validate that data, allowing the visitor to make corrections before the data is sent back to the server, then all of that validation must be performed by client-side technology.
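A minimal sketch of such client-side validation in plain JavaScript might look like the following. The field names (name, email) are hypothetical; on a real page this function would be wired to the form's submit handler and would read its values from the form fields:

```javascript
// A sketch of client-side form validation. Runs in the visitor's
// browser, so corrections can be made before anything is sent back
// to the server. Field names here are hypothetical examples.
function validateForm(fields) {
  var errors = [];
  if (!fields.name || fields.name.replace(/\s/g, "") === "") {
    errors.push("Name is required.");
  }
  // A simple (deliberately not exhaustive) shape check for an
  // email address: something@something.something
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(fields.email || "")) {
    errors.push("Email address looks invalid.");
  }
  return errors; // an empty array means the data is ready to send
}
```

Only when this returns no errors would the page submit the form, saving a round trip to the server for input the visitor could have fixed on the spot.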
In Active Server Pages (ASP, a server-side technology) you may write some code using VBScript. That code runs on the server, not on the client. Similarly, if you write PHP code, perhaps extracting data from a MySQL database, you are writing server-side code. On the other hand, if you write some JavaScript, it is included with the HTML that makes up the web page and is sent down to the client computer for processing. The HTML in the page and the JavaScript are both examples of client-side technologies. The question sometimes comes up: why can’t fields used in VBScript also be referenced in JavaScript? The answer is that they exist at different times and on different computers. To communicate between server-side and client-side code, data must be transmitted back and forth between the server and client computers.
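In other words, the two sides never share variables directly; data crosses the gap only as text in an HTTP request or response. As a rough sketch (using JSON as the text format, one common choice, though form-encoded pairs work the same way), here is the client's half of that exchange:

```javascript
// Server-side and client-side code never touch each other's
// variables; data crosses between them only as text. This sketch
// shows the client's half: serializing data for transmission,
// and parsing the text a (hypothetical) server sends back.
function encodeForServer(data) {
  return JSON.stringify(data); // becomes the body of a request to the server
}

function decodeFromServer(responseText) {
  return JSON.parse(responseText); // server's textual reply, back into an object
}
```

The server-side code (ASP, PHP, or anything else) does the mirror image: it parses the incoming text, works with its own variables, and serializes its answer back into text for the trip down to the client.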
Until recently there have been many more options for server side technology than
client side — as far as the web is concerned. Although JavaScript is great for
many things, and in the hands of some creative programmers can accomplish some
remarkable feats, it doesn’t satisfy developers’ appetite for ever more demanding applications. There’s always ActiveX, and of course all the other flavors of downloadable, pluggable code, but these can create security-related difficulties. This situation, and the desire for a means to span the client-server chasm, inspired Microsoft to come up with a new framework.
Building on existing technologies, adding mechanisms to assure security at the client end, and providing a means to use a large number of programming languages, the Dot Net Architecture enables a new level of client-server web computing. Using DNA, you can switch sides without having to switch languages.
By the way, I just saw a chicken standing in the middle of the road looking
first to one side, then to the other. She had a sly grin on her beak!