Using A Proxy To Ensure Availability Of A Service


If we wish to use dynamic web services to provide new and innovative services, then we must also be aware of the risks inherent in such an approach: a delay or failure in a service may cause problems for more than just the direct users of that service.

As web services could be used to provide a variety of services that we may not be aware of, or may form part of a larger composite application, the potential for widespread indirect failures should also be considered. The aim of this project is to demonstrate the use of a proxy to mitigate such risks.

The project should demonstrate whether a proxy can be used to:
• Maintain a service if the connection to the source information is severed
• Provide a stated and consistent level of response time to a request for a service
• Allow scalability to cope with increased demand without compromising the service
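The first goal above, maintaining a service when the connection to the source is severed, boils down to serving the last good response from a cache. A minimal sketch in Python, where the `fetch` callable and the cache dictionary are illustrative assumptions rather than any particular proxy product:

```python
# Sketch of a proxy's failover behaviour: serve the last good response
# when the connection to the source service is severed.
# `fetch` stands in for the call to the real upstream service.

def fetch_with_fallback(fetch, cache):
    """Try the live service; fall back to the cached copy on failure."""
    try:
        body = fetch()                # contact the source service
        cache["last_good"] = body     # remember the latest good response
        return body, "live"
    except OSError:                   # connection refused, timed out, etc.
        if "last_good" in cache:
            return cache["last_good"], "cached"  # keep the service available
        raise                         # nothing cached yet: genuine outage
```

A real proxy (Squid, or an Apache reverse proxy) layers expiry rules and freshness checks on top of this basic idea, which is exactly the performance-versus-staleness tension discussed below.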


Unfortunately the free licences under which the examples were hosted have expired. The link will take you to a page hosted by RightScale, which shows how it works.


The precise scope of the project took a while to define, as it covers several different areas. Research into the use of caches advises their use for more static information (e.g. photographs, CSS, header and footer HTML, downloadable forms, etc.) and highlighted the tension between performance (which implies maximising use of caches) and ensuring that data is as up to date as possible (which implies minimal use of caches).

Another area where caching should be avoided is confidential data, as caching it presents a security risk. Indeed, the sources disagree here: some state that HTTPS content cannot be cached, while others say that it can!
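Whatever the position on HTTPS, standard HTTP response headers exist to tell browsers and intermediaries not to cache confidential content. A minimal sketch (the helper function name is mine; the header values are standard HTTP/1.1 and HTTP/1.0 directives):

```python
# Response headers that instruct shared caches and browsers not to store
# confidential content. These are standard Cache-Control directives.
def no_cache_headers():
    return {
        "Cache-Control": "private, no-store, no-cache, must-revalidate",
        "Pragma": "no-cache",  # belt-and-braces for legacy HTTP/1.0 clients
        "Expires": "0",
    }
```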

Sizing of the cache also needs to be carefully considered. Caches tend to operate on a first-in, first-out basis, so that the oldest item that hasn't been read is the first to be dropped. This means that a small cache on a busy website may only store a particular item for a few moments before it is pushed out.
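The first-in, first-out eviction described above can be sketched in a few lines of Python. This is a toy model to illustrate why a small cache on a busy site sheds items quickly, not a model of any particular cache product:

```python
from collections import OrderedDict

class FIFOCache:
    """First-in, first-out cache: the oldest stored item is evicted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._items = OrderedDict()  # preserves insertion order

    def put(self, key, value):
        if key not in self._items and len(self._items) >= self.capacity:
            self._items.popitem(last=False)  # drop the oldest entry
        self._items[key] = value

    def get(self, key):
        # A miss (None) would mean another trip to the origin server.
        return self._items.get(key)
```

With a capacity of 3, storing a fourth item pushes the first one out, so on a busy site a small cache gives each item only a brief stay.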

As WCC's website already uses a proxy server to provide load balancing and failover, I wasn't sure what new information this work would provide. I entered a bit of a cul-de-sac when I started to create some virtual Ubuntu servers in virtual machines on my PC, with a view to creating a load balancer pointing at a pair of virtual Apache servers, each containing a "Hello World" file with slight discrepancies, so that refreshing the URL would make it clear that the response was coming from a different 'server' ('Hello World' vs 'Bonjour Le Monde', etc.). But then I realised that this approach was not radically different from our existing set-up, and so would not enhance our knowledge. What was needed was a way to do this with cloud-based servers.
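The alternating 'Hello World' / 'Bonjour Le Monde' behaviour I was aiming for amounts to simple round-robin balancing, which can be sketched as follows (the back-end strings stand in for the two virtual Apache servers; a real balancer would also health-check them):

```python
import itertools

# Stand-ins for the two virtual Apache servers described above.
backends = ["Hello World", "Bonjour Le Monde"]

def round_robin(servers):
    """Hand out back ends in strict rotation, like a basic load balancer."""
    return itertools.cycle(servers)

balancer = round_robin(backends)
first_four = [next(balancer) for _ in range(4)]  # alternates between the two
```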

My next step was to search Google for 'scalable virtual servers'. As well as Amazon, who are quite famous players in this area, the search revealed several other companies which provide this type of service.

Their specialism is to allow you to create virtual web servers, database servers, load balancers, etc. using a browser-based interface. I contacted a few of these companies to see if it was possible to set up a free demonstration of Hello World, which would allow us to easily add in "Guten Tag, Welt".

I was able to create the Hello World server. It was an Apache server based in the USA, but it had a limited lifespan, and I was not able to set up the load-balanced model. I am looking into trying the full test at a more persistent URL, and have had discussions with GigaSpaces about this. GigaSpaces sent me a licence for an hour-long demonstration site, but I was not able to get a working example up in this time.

In a future expansion of this work, we may wish to investigate whether JA.Net also offers such a service.


This had mixed success. There are cloud server providers who allow you to rapidly expand and contract the number of servers which underpin a website. The existing suppliers in this area tend to offer a custom service for Domino, and our own Domino admins advise that clusters of Domino servers hit a practical limit of six boxes, but there are many different server configurations on offer onto which Domino could potentially be installed. Apache is a very common configuration. Although I was able to create a basic "Hello World" example quite quickly, my licences expired before I was able to build and test a load-balanced example.

One approach which may offer resilience in this area would be to use different suppliers of cloud servers, with a foundry switching between the two suppliers.
There may be some mileage in the cloud-based server approach for enhancing the scalability of public-facing content, but the case is not yet proven. Even if further research were to prove that company "Y" could provide "X" additional load-balanced servers within "Z" minutes, major concerns still exist over security.

Gartner highlights seven major areas of concern which cloud computing poses over and above traditional architectures. These are:
1. Privileged user access. Sensitive data processed outside the enterprise brings with it an inherent level of risk, because outsourced services bypass the "physical, logical and personnel controls" IT shops exert over in-house programs. Get as much information as you can about the people who manage your data. "Ask providers to supply specific information on the hiring and oversight of privileged administrators, and the controls over their access," Gartner says.

2. Regulatory compliance. Customers are ultimately responsible for the security and integrity of their own data, even when it is held by a service provider. Traditional service providers are subjected to external audits and security certifications. Cloud computing providers who refuse to undergo this scrutiny are "signaling that customers can only use them for the most trivial functions," according to Gartner.

3. Data location. When you use the cloud, you probably won't know exactly where your data is hosted. In fact, you might not even know what country it will be stored in. Ask providers if they will commit to storing and processing data in specific jurisdictions, and whether they will make a contractual commitment to obey local privacy requirements on behalf of their customers, Gartner advises.

4. Data segregation. Data in the cloud is typically in a shared environment alongside data from other customers. Encryption is effective but isn't a cure-all. "Find out what is done to segregate data at rest," Gartner advises. The cloud provider should provide evidence that encryption schemes were designed and tested by experienced specialists. "Encryption accidents can make data totally unusable, and even normal encryption can complicate availability," Gartner says.

5. Recovery. Even if you don't know where your data is, a cloud provider should tell you what will happen to your data and service in case of a disaster. "Any offering that does not replicate the data and application infrastructure across multiple sites is vulnerable to a total failure," Gartner says. Ask your provider if it has "the ability to do a complete restoration, and how long it will take."

6. Investigative support. Investigating inappropriate or illegal activity may be impossible in cloud computing, Gartner warns. "Cloud services are especially difficult to investigate, because logging and data for multiple customers may be co-located and may also be spread across an ever-changing set of hosts and data centers. If you cannot get a contractual commitment to support specific forms of investigation, along with evidence that the vendor has already successfully supported such activities, then your only safe assumption is that investigation and discovery requests will be impossible."

7. Long-term viability. Ideally, your cloud computing provider will never go broke or get acquired and swallowed up by a larger company. But you must be sure your data will remain available even after such an event. "Ask potential providers how you would get your data back and if it would be in a format that you could import into a replacement application," Gartner says.


This could provide scalability to the school closures page. The method implemented for school closures, whereby the Domino system updates a static HTML file every 'n' seconds and the public are redirected to that HTML page instead of to the Domino system, could be extended so that the HTML page is cloned among a variety of virtual load-balanced servers residing in the cloud.
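The update-every-'n'-seconds pattern can be sketched as below. The file name, the `get_closures` data source and the HTML layout are all illustrative assumptions for this note, not the actual Domino agent; the point is that only the flat file, which is trivially cloneable, ever faces the public:

```python
import time
from pathlib import Path

OUTPUT = Path("closures.html")  # the flat file the cloud clones would replicate

def render(closures):
    """Build a simple HTML page listing the closed schools."""
    items = "".join(f"<li>{school}</li>" for school in closures)
    return f"<html><body><ul>{items}</ul></body></html>"

def publish(closures, output=OUTPUT):
    """Overwrite the static page that the public URL redirects to."""
    output.write_text(render(closures))

def run(get_closures, n_seconds=60):
    """Regenerate the static page every n seconds from the back-end system."""
    while True:
        publish(get_closures())
        time.sleep(n_seconds)
```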


A major issue with using cloud-based virtual servers is security. It is one thing to trust an external hosting company with information that is in the public domain; it is quite another to trust them with confidential, sensitive and personal information.
This consideration does present some advantages for us, though, as it means that we can divide our web presence between a public read-only zone (for content like 'published' web pages and vacancy lists, which should be relatively straightforward to expand using cloud-based techniques), a public secure zone (for application forms containing the public's details) and a secure extranet. Each area would be clustered for resilience, and each would be tuned to the task at hand at the server level.

For example, the public read-only area could be set up to grant update access only to specific named IDs from specific named IPs, with no need for a user registration facility; the default access for unauthenticated users would be read. It would also be set up to utilise cloud-based clones to help with peaks in traffic. The public secure zone and secure extranet could be configured to have mandatory SSL for every connection and to deny all access to unauthenticated users; they would not be suitable for proxies or cloud-based clones. The public secure zone would have a Me@WCC-type self-registration service; the secure extranet wouldn't. This topology would also let you implement different branding for the different areas of the site to highlight their different functions. One issue is that items like CSS and header/footer HTML would need to be maintained in both HTTP and HTTPS versions, to prevent messages like 'this page contains secure and insecure items, continue?'
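The zone split described above can be summarised as a policy table. A hedged sketch in Python, where every value is illustrative rather than an actual WCC configuration:

```python
# Illustrative policy table for the proposed three-zone topology.
# All settings are assumptions drawn from the description above.
ZONES = {
    "public_read_only": {       # 'published' pages, vacancy lists
        "default_access": "read",
        "updates": "named IDs from named IPs only",
        "self_registration": False,
        "ssl_mandatory": False,
        "cloud_clones": True,   # expandable for traffic peaks
    },
    "public_secure": {          # application forms with personal details
        "default_access": "deny unauthenticated",
        "self_registration": True,   # Me@WCC-type service
        "ssl_mandatory": True,
        "cloud_clones": False,  # confidential data stays off the cloud
    },
    "secure_extranet": {
        "default_access": "deny unauthenticated",
        "self_registration": False,
        "ssl_mandatory": True,
        "cloud_clones": False,
    },
}
```

Expressing the policy as data like this makes the contrast explicit: only the read-only zone is a candidate for cloud cloning.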


Security - A major concern with this approach is security, which operates on several levels. Personal and confidential data cannot be put at risk; these issues must be at the forefront of any cloud-based virtualisation strategy. It is not for me to prove that the following theoretical risks are possible, probable or even certain to happen; others must prove that they are impossible.
Are the hosting organisations trustworthy?
Are they competent?
Are their products secure?
If there is a problem, what redress is there? (The nightmare scenario is a rubbish product from an unhelpful company, fronted by convincing salesmen and backed up by sharp lawyers who limit their liability…)
It is not reasonable to suppose that we could offload our legal and moral responsibilities to protect personal data onto a third party.

Jericho Forum
They have published a white paper on secure collaboration in the cloud. They identify four different criteria to determine what should be in the cloud and what shouldn't. These dimensions are described in what they call the Cloud Cube Model, which covers the following areas:
External vs Internal (where does the data sit: e.g. Amazon vs a WCC VM?)
Proprietary vs Open (how easy would the solution be to migrate to another provider?)
Perimeterised vs De-perimeterised (how far do we wish to move away from traditional controls and firewalls?)
Outsourced vs Insourced (whether the service is provided by internal staff or by a third party)

The Jericho YouTube video highlights that some services positively should not be moved out to the cloud. While they argue for an external, open, de-perimeterised, outsourced model, they do acknowledge that cloud security does not exist.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License