Monday, April 19, 2010

Topic 1: E-commerce, distributed applications and the Internet


Many e-commerce models, such as B2B, B2C, C2C, E2C and E2B, along with social networking, require an underlying infrastructure that includes technologies such as:

  • Broadband Communications and Networking
  • Internet and mobile E-commerce via wireless data communications
  • Client/server or MVC architectures for Web framework systems
  • Data interchange
  • Access and cryptographic security
  • Electronic payments, databases and multimedia

Some of the core e-business services that can then be provided by an organisation's e-systems architecture include:

  • Online merchant account facilities
  • Secure credit card processing and electronic payment systems
  • Custom order processing to meet your organisation's specific needs
  • Catalogue management
  • Shopping cart facilities
  • Daily site traffic and ordering statistics
  • Instant updates
  • Ease for consumers to browse and compare prices
  • Convenient for the consumer
  • No face-to-face communication.
As increased broadband bandwidth helps build online business services, other influences, such as the development of 10 Gigabit Ethernet, the release of Web 2.0 tools since 2005 and the rise of Web application frameworks, have led to the emergence of social networks as a new business model and marketplace.

What is the big push behind E-commerce?
The list of advantages below may help answer this question:
  • Efficiency gains.
  • User friendly.
  • Open 24 hours a day, 7 days a week. Unlike real stores, staff need not be employed 24 hours a day for a business to receive orders and process payments.
  • Orders can be processed online in real time, or offline in batch processing.
  • Globalisation – shops are not geography constrained and can compete with national and multinational companies for consumers located anywhere in the world. It is much cheaper than opening a shop and advertising in numerous countries.
  • Provides flexibility.
  • Reduction in processing costs - especially when large volume of business occurs between certain companies.
  • Speed.
  • Market freedom.
  • New avenues for fundraising activities.
  • Improved competitiveness.

However, your feelings towards doing business online may be tempered when we look at some of the issues and risks involved:
  • Security.
  • Vandalism.
  • Sabotage.
  • Theft and fraud.
  • Breach of privacy or confidentiality.
  • Violations of data integrity.

Rapid evolutionary prototyping approach

A case history

In 1996, I was consulted on building an inaugural e-commerce site and intranet for a national insurance broking business with over ninety branch offices across the nation. Many new Web technologies have emerged since that time. We engaged a professional graphic designer for all logos, buttons and image maps, and used Perl and CGI for all server-side interactivity. This was the beginning of a developing e-systems infrastructure for the company.

For the work breakdown structure, I used the 'builder metaphor' for developing Web applications. This is similar to a project homebuilder sub-contracting work to others, e.g. carpenter, electrician, concreter, cabinetmaker, carpet layer and painter. This approach was easier for me to handle as project manager and easier for the business client to understand the apparent costs associated with the Web site development. It was used in tandem with a Rapid Evolutionary Prototyping Approach.

The Rapid Evolutionary approach is recommended for small projects; it is a fluid process consisting of two main phases:

1. evolutionary application prototyping; and

2. implementation.

After capturing the business requirements and the system specifications, the object modelling focused on what the system should do, rather than how to do it. One of the attractions of this approach is that users see a visible, tangible system as construction takes place. One of the pitfalls occurs if the system becomes unstable or hard to maintain once it is moved to the production site. Make sure that you include proper project management and quality techniques in your e-business application development.

In the final project report I recommended that any maintenance or site makeover be done by an outsourcing company. The intranet was in constant use, with some 'fine tuning', until 2001. A new site was released in early 2002 but looks vastly different in 2010. The reason the first design lasted for five to six years was that the site did what was required and the business needs were slow to change, until a corporate restructuring.


Components of the online store

IBM (http://www.ibm.com) describes some key e-commerce Web page terms relating to online business development. While the list can be quite long, here are a few terms that IBM uses, which give you an indication of the components of online shopping as a common form of e-commerce:

order list – A list of products that the user has identified as being under consideration for purchase, common to shopping cart sites.

order list page – A page that contains the order list.

product list – A list of products in the e-catalogue. Typically, the product list contains each product's name, price, and a very brief description. It is linked to more detailed information and may also include a mechanism for adding items to the order list.

product category navigation page – A page that presents product offerings grouped by categories, such as brand or intended usage.

product description page – A page that describes a product in detail and allows the user to add the product to the order list.

store front – A point of entry to an online store. Sometimes this page is the same as the company home page (http://www.companyname.com). Other times it is separate (perhaps http://www.companyname.com/shop).
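To make these components concrete, here is a minimal Python sketch of an order list backed by a product catalogue. All class and field names are hypothetical illustrations, not part of any IBM specification.

    # Minimal order-list sketch for a shopping-cart site.
    # All names here are hypothetical, not an IBM API.

    class Product:
        def __init__(self, sku, name, price, description=""):
            self.sku = sku                  # unique catalogue identifier
            self.name = name
            self.price = price              # unit price
            self.description = description  # brief text for the product list

    class OrderList:
        """Products the user has under consideration for purchase."""
        def __init__(self):
            self.items = {}                 # sku -> (product, quantity)

        def add(self, product, quantity=1):
            prod, qty = self.items.get(product.sku, (product, 0))
            self.items[product.sku] = (prod, qty + quantity)

        def total(self):
            return sum(p.price * q for p, q in self.items.values())

    cart = OrderList()
    cart.add(Product("A100", "Widget", 9.95))
    cart.add(Product("A100", "Widget", 9.95))   # quantity becomes 2
    print(cart.total())                         # 19.9

An order list page would then simply render cart.items, and a product description page would call cart.add for the product it displays.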

Where did you find the e-commerce model?

An organisation needs to have an e-commerce model in place as a blueprint for the development of the whole e-systems infrastructure. The e-commerce model is an abstraction of the infrastructure to be developed and serves as a communication mechanism for the e-business application project team. A well-developed application model, using Model View Controller (MVC) and Unified Modelling Language (UML), supports traceability throughout its elements and artefacts.
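To make the MVC idea concrete, here is a deliberately tiny sketch of the pattern in Python; the class names are illustrative only, not a prescribed design.

    # A deliberately tiny Model-View-Controller sketch (illustrative only).

    class ProductModel:                  # Model: holds the application data
        def __init__(self):
            self.products = {"A100": 9.95, "B200": 19.50}

    class ProductView:                   # View: presents the data to the user
        def render(self, products):
            for sku, price in products.items():
                print(f"{sku}: ${price:.2f}")

    class ProductController:             # Controller: mediates between the two
        def __init__(self, model, view):
            self.model, self.view = model, view

        def show_catalogue(self):
            self.view.render(self.model.products)

    ProductController(ProductModel(), ProductView()).show_catalogue()

The separation means the view can be swapped (say, for an HTML template) without touching the model, which is part of what supports traceability across the model's elements and artefacts.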

Figure 1.1 shows how an e-commerce model relates to business application development and the increasing use of new web and multimedia technologies. Part of your job may be to make connections from the current e-business application (e.g. E-catalogue, Shopping Cart), the E-commerce model and the latest technologies to the E-systems Infrastructure plans.



The Internet: Architecture, protocols, standards and services

The Internet is an information pipeline which grew out of ARPANet, a USA Department of Defense experiment. It was originally used exclusively for non-commercial (primarily academic) purposes, but the rules against commercial exploitation of the network have been relaxed considerably over the years.

The Internet is a global network of networks, or internetwork, which connects millions of users using packet-switching technology. A packet is a block of data transmitted as an individual entity. Each packet sent across the Internet must follow the format of the Internet Protocol (IP), so that it can be distinguished from other types of data packets; IP packets are also called IP datagrams. Dedicated routers interconnect the various computer networks.

The Internet Architecture Board (IAB) is responsible for setting standards relating to the Internet. IAB standards are ratified by the Internet Society (ISOC), a large body that all users of the Internet have the option of joining. Most of the work carried out before a new standard is approved is done by working parties of the Internet Engineering Task Force (IETF). Although most new standards are initially proposed by the IETF, any organisation can propose that a new protocol or technology become an approved standard.

To establish a new standard, you first need to submit a document as an Internet Draft. After a period of consultation with the research community you will then submit a modified version of the proposal as a Request For Comment (RFC). The IAB's RFC editor will allocate an RFC number to the proposal and it will be made available through the main RFC archive (along with mirror sites around the world). Finally, after a further period of consultation, the IAB may recommend that the RFC be submitted to the ISOC as a proposed new standard.

W3C

Standards associated with the World Wide Web are dealt with by a separate body, the World Wide Web Consortium (W3C) at http://w3.org. W3C develops interoperable technologies (specifications, guidelines, software, and tools) to lead the Web to its full potential as a forum for information, commerce, communication, and collective understanding. This body is a consortium of representatives of the main companies involved in Web development. It was created to provide a faster mechanism for approving new standards, as the delays in the ISOC process were leading to de facto standards emerging and being superseded long before they were approved by the ISOC.

HTTP and HTTPS

Tim Berners-Lee defined HTTP in 1992 as a connectionless, stateless protocol where a transaction consists of:

Connection

The establishment of a connection by the client to the server. When using TCP/IP, port 80 is the well-known port, but other non-reserved ports may be specified in the URL;

Request

The sending, by the client, of a request message to the server;

Response

The sending, by the server, of a response to the client.

Close

The closing of the connection by either party.

This is what happens when you use a browser to view the page at csu.edu.au.
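You can watch those four steps by hand with a few lines of Python; this is a sketch only, and it assumes the host still answers plain HTTP on port 80.

    # The four HTTP steps, performed manually with a raw TCP socket.
    # Assumes www.csu.edu.au still answers plain HTTP on port 80.
    import socket

    s = socket.create_connection(("www.csu.edu.au", 80))   # 1. Connection
    s.sendall(b"GET / HTTP/1.1\r\n"                        # 2. Request
              b"Host: www.csu.edu.au\r\n"
              b"Connection: close\r\n\r\n")
    response = b""
    while chunk := s.recv(4096):                           # 3. Response
        response += chunk
    s.close()                                              # 4. Close
    print(response.split(b"\r\n")[0])   # status line, e.g. a 200 or a redirect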

For secure transactions with the Secure Sockets Layer (SSL), the HTTP protocol is enhanced by the use of encryption to provide a secure link. This enhanced protocol is called HTTPS. SSL is often used to transfer credit card numbers and other sensitive information; when it is in use, the https:// prefix appears in the browser's location box (e.g. in your CSU online Subject Outline) and the browser displays a closed-lock icon to indicate a secure site. More about security later, in Topic 7.
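The difference at the code level is small: the client wraps its TCP socket in an encrypted SSL/TLS layer before any HTTP is spoken, and uses the well-known HTTPS port 443. A sketch, again assuming the same host:

    # The same request over HTTPS: wrap the socket in SSL/TLS first.
    import socket, ssl

    ctx = ssl.create_default_context()          # verifies the server certificate
    raw = socket.create_connection(("www.csu.edu.au", 443))
    s = ctx.wrap_socket(raw, server_hostname="www.csu.edu.au")
    s.sendall(b"GET / HTTP/1.1\r\nHost: www.csu.edu.au\r\n"
              b"Connection: close\r\n\r\n")
    print(s.recv(4096).split(b"\r\n")[0])       # status line, now sent encrypted
    s.close()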

Let us start with the fundamental Internet protocol suite and work our way through to the security aspects. TCP/IP takes us on a trip back in time . . .

TCP/IP

Transmission Control Protocol/Internet Protocol (TCP/IP) was developed in the 1970s and is the protocol suite used by the Internet and World Wide Web. Its layers do not follow the OSI model, because TCP/IP preceded the OSI model by almost a decade. Most of the services that we normally associate with the Internet are delivered via TCP/IP. These services include file transfer via the File Transfer Protocol (FTP), remote login via the Telnet protocol, electronic mail distribution via the Simple Mail Transfer Protocol (SMTP), and access to Web pages via the Hypertext Transfer Protocol (HTTP).
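Each of these services is simply an application protocol carried over a TCP connection. As one example, here is mail sent via SMTP using Python's standard library; the addresses are placeholders, and the sketch assumes a mail server is listening on localhost.

    # Sending mail over SMTP, one of the classic TCP/IP application services.
    # Assumes an SMTP server listens on localhost port 25; addresses are placeholders.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "alice@example.com"
    msg["To"] = "bob@example.com"
    msg["Subject"] = "TCP/IP services demo"
    msg.set_content("Delivered via SMTP riding on a TCP connection.")

    with smtplib.SMTP("localhost", 25) as server:
        server.send_message(msg)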

FTP

FTP (File Transfer Protocol) allows users to move data files from computer to computer. The vast quantity and range of resources available through FTP made it one of the most popular features of the Internet. Much of what is available this way is software, including anti-virus utilities, printer typefaces, games, graphics and updates of commercial software. There are FTP archives of software for most computers. Books, journals, reports and other documents are also available through FTP; you can find and acquire copies of lecture notes, status reports (e.g. for NASA missions) and numerous other documents. A typical session involves commanding an FTP program to connect to a remote FTP host specified by its network address, moving around in the directories on the host, and requesting the system to get the desired files. Unregistered users can gain access via anonymous FTP, where they identify themselves literally as 'anonymous'.
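That typical session maps directly onto Python's standard ftplib; the host and file names below are placeholders for a real FTP archive.

    # A typical anonymous FTP session using Python's standard ftplib.
    # ftp.example.com and the paths are placeholders for a real archive.
    from ftplib import FTP

    ftp = FTP("ftp.example.com")        # connect to the remote FTP host
    ftp.login()                         # no arguments = user 'anonymous'
    ftp.cwd("/pub/reports")             # move around in the host's directories
    with open("status.txt", "wb") as f:
        ftp.retrbinary("RETR status.txt", f.write)   # get the desired file
    ftp.quit()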

Remote Login Telnet

Remote login, also called Telnet, allows users to connect to other computers and the services they run. One of the original ideas behind the Internet was to allow researchers to use programs and resources mounted on computers at other facilities, and the Internet features tens of thousands of computers accessible via remote login. This worked fine when networks gave limited access, but today using raw Telnet exposes your username and password to an eavesdropper, so we need the added protection of SSH.

SSH secure shell protocol

The need for better security than FTP or Telnet can offer for corporate network services, together with the use of HTTP as an FTP alternative, has led to more secure ways of remotely accessing a network.

One of these secure alternatives is the secure shell, SSH. PuTTY is a freely available Telnet/SSH client, and secure copy (PSCP) is another free tool that replaces an FTP client on the Win32 platform. Secure Shell is a program for logging into another computer over a network, executing commands on a remote machine, and moving files from one machine to another. It provides strong authentication and secure communications (encryption) over unsecured channels. It is intended as a replacement for telnet, rlogin, rsh and rcp. For SSH2, there is also a replacement for FTP: sftp.
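As a sketch of what this gives you programmatically, the third-party paramiko library implements SSH2 in Python; the host, credentials and paths below are placeholders.

    # Remote command execution and SFTP over SSH2, via the third-party
    # paramiko library (pip install paramiko). Host, user and paths are placeholders.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("host.example.com", username="user", password="secret")

    stdin, stdout, stderr = client.exec_command("uname -a")   # like telnet,
    print(stdout.read().decode())                             # but encrypted

    sftp = client.open_sftp()                   # the FTP replacement: sftp
    sftp.get("/remote/report.txt", "report.txt")
    sftp.close()
    client.close()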

Recognising the Web 2.0 and other changes since 2005

You should recognise the way in which an online distributed business application is used, as it may have accumulated various technologies over time. Component technologies such as HTTP, HTML, CGI, XML, dynamic clients, or session management mechanisms that use cookies to index a dictionary for a shopping-cart site may creep into the infrastructure without quality planning.
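As an illustration of that cookie mechanism, here is a sketch using only Python's standard library: the cookie carries nothing but a session id, which indexes a server-side dictionary holding each visitor's cart.

    # Sketch of cookie-based session management for a shopping cart.
    # The cookie holds only a session id; the cart lives in a server-side dictionary.
    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from http.cookies import SimpleCookie

    carts = {}   # session id -> list of items (the dictionary the cookie indexes)

    class CartHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            cookie = SimpleCookie(self.headers.get("Cookie", ""))
            sid = cookie["sid"].value if "sid" in cookie else str(uuid.uuid4())
            cart = carts.setdefault(sid, [])
            if self.path.startswith("/add/"):
                cart.append(self.path[5:])       # e.g. GET /add/widget
            self.send_response(200)
            self.send_header("Set-Cookie", f"sid={sid}")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(f"Cart {sid[:8]}: {cart}".encode())

    HTTPServer(("localhost", 8000), CartHandler).serve_forever()

A production site would add expiry, persistence and security to this, which is exactly the quality planning argued for above.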

Recent developments with Web 2.0 tools since 2005 have spawned the growth of social networks, through the use of RSS and development frameworks like Ruby on Rails, and a move towards common business applications delivered online by 37signals.com and google.com.

That is why it is important to gain technical knowledge of the competing frameworks that have been developing for some years, through efforts such as Sun's Java systems and Microsoft .NET, or application servers like Zope and other open-source systems.

These application servers build upon foundations laid by earlier Web servers from Apache and Microsoft IIS, as well as scripting environments like Python, ASP.NET VBScript and PHP. An understanding of the role of XML as a core technology in developing an e-systems architecture is also useful.

Cloud and grid architecture

Cloud and grid computing offer a cost-effective solution to providing business, education and other services, often called 'utility computing'. They address the problem of how to provide services, data storage and computing power to users without a single organisation bearing the full cost of maintenance and upgrades. Grid computing is a cluster of computers using a parallel-processing architecture, in which CPU resources are shared across the network so that the cluster acts as one large computer. Cloud computing is a set of dynamically scalable, virtual services delivered over the Internet; all you need is an Internet connection! Like a black box, you need not control the 'cloud' that provides services such as Google Documents or the applications and services at 37signals.com.
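On a single machine, Python's multiprocessing pool gives the flavour of the grid idea: carve a job into pieces and farm them out to whatever CPUs are available, as if they were one large computer. This is only an analogy; a real grid distributes the same map across machines on a network.

    # Single-machine analogy for grid computing: split a job into pieces
    # and farm them out to a pool of workers acting as one computer.
    from multiprocessing import Pool

    def price_with_tax(price):
        return round(price * 1.10, 2)    # some CPU work per item

    if __name__ == "__main__":
        prices = [9.95, 19.50, 4.20, 100.00]
        with Pool() as pool:             # one worker per available CPU
            print(pool.map(price_with_tax, prices))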

Coupled with developments in mobile devices, geo-location, high-speed broadband networks and service-oriented computing, cloud and grid computing involve the increased use of remote services shared by many users.

As an example, Google offers the common-business-applications-online approach through its Google Documents, Google Maps and Google Earth applications, providing APIs so that others can include these services in their own Web sites or applications.

Hence a trend towards complex applications being processed in the cloud will involve infrastructure changes, including grid computing as an extension of clustered data farms with large processing power and enormous storage capacity. All this would be beyond the capability of many small to medium businesses.

