Why Firewall Security Is Necessary to Protect Your Network
In your car, the firewall sits between the engine compartment and the front seat and is built to keep you from being burned by the heat of the combustion process. Your computer has a firewall, too, for much the same reason: to keep you and your data from being burned by the hackers and thieves who create the Internet's own brand of combustion and destruction.
The firewall, a combination of hardware and software that regulates and monitors network communications, is there to inspect the traffic and all the "packets" of information that pass through to your inner sanctum, your CPU and hard drives. A firewall will rule out, or at least greatly minimize, the possibility of harm by flagging and quarantining potentially harmful "zones," and it will either deny or permit access to your computer based on the set of rules in force at the time, which in turn depends on many (very many) factors.
Basic tasks and settings
The basic task for a firewall is to regulate the flow of traffic between computer networks that have different "trust levels." The Internet is full of countless overlapping zones, some safe and some downright deadly. Internal networks, on the other hand, are more likely to contain zones that warrant a bit more trust. Zones that fall in between the two, or are hard to categorize, are sometimes referred to as "perimeter networks" or, in a bit of geek humor, Demilitarized Zones (DMZ).
Without proper configuration, a firewall can simply become another worthless tool. Standard security practices call for a "default-deny" firewall rule, meaning that the only network connections that are allowed are the ones that have been explicitly okayed, after due investigation. Unfortunately, such a setup requires detailed understanding of network applications and a great deal of time and energy to establish and administer.
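To make the idea concrete, here is a minimal sketch in Python of how a default-deny rule set behaves. The rules, addresses and ports below are made up for illustration; real firewalls such as iptables or pf apply the same logic inside the kernel, far more efficiently.

```python
import ipaddress

# Each rule explicitly allows one kind of traffic; anything that matches
# no rule is dropped. These example rules are hypothetical.
ALLOW_RULES = [
    {"proto": "tcp", "dst_port": 443},                      # outbound HTTPS
    {"proto": "tcp", "dst_port": 22, "src": "10.0.0.0/8"},  # SSH from the LAN only
]

def is_allowed(packet: dict) -> bool:
    """Return True only if some rule explicitly permits the packet."""
    for rule in ALLOW_RULES:
        if rule["proto"] != packet["proto"]:
            continue
        if "dst_port" in rule and rule["dst_port"] != packet["dst_port"]:
            continue
        if "src" in rule and ipaddress.ip_address(packet["src"]) not in ipaddress.ip_network(rule["src"]):
            continue
        return True
    return False  # default-deny: no matching rule means the packet is dropped

# A web request gets through; an unsolicited telnet probe does not.
print(is_allowed({"proto": "tcp", "dst_port": 443, "src": "192.0.2.8"}))    # True
print(is_allowed({"proto": "tcp", "dst_port": 23, "src": "203.0.113.9"}))   # False
```

Flipping that final `return False` to `return True` (and turning the rules into block rules) would give you the "default-allow" posture described below, which is exactly why that posture is riskier: everything you forgot to think about gets let through.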
Who can do what?
Many businesses and individuals lack sufficient computer and network knowledge to set up a default-deny firewall, and will therefore use a riskier but simpler "default-allow" rule, in which all traffic is permitted unless it has been specifically blocked for one of a number of possible reasons. This way of setting up a firewall makes "mysterious" and unplanned network connections possible, and the chance that your system will be compromised rises considerably.
Firewall technology had its first growth period in the computer technology revolution of the late 1980s, when the Internet was fairly new in terms of its global reach and connectivity options. The predecessors to today's hardware/software hybrid firewalls were the routers used in the mid 1980s to physically separate networks from each other. However small the Internet was at the start, its supremely fast growth and the lack of security planning led to inevitable breaches through those older ("prehistoric") firewall designs. Fortunately, computer pros learn from their errors, and firewall technology continues improving daily.
Tuesday, December 28, 2010
The Future of Dedicated Hosting Delivery
For all the hype, it is only over the last few years that an increasing number of businesses have started moving not just distribution but core business processes online in earnest. The main reason this much-anticipated migration has dragged its heels is that change takes time, and businesses going online face hurdles of cost, complexity, resourcing, and marketing at every step of the process.
The infrastructure workhorse of this fundamental change is hosting.
As many businesses now know, hosting has a wide range of options in terms of cost and function, but it's the growth of Dedicated Hosting that has continued to gather momentum over recent years. The most interesting aspect of this growth is that indicators show that most businesses are at the bottom of the adoption curve and that the most aggressive growth is yet to come.
What customers want
What customers have wanted, but more importantly needed, has changed considerably over the past years. As businesses become leaner and headcounts shrink, priorities and their drivers have changed too. So-called "have-to-haves", the essential requirements, are the only issues getting any traction, relegating "nice-to-haves" to the back-burner until they either become irrelevant or are escalated for other reasons.
This phenomenon has seen companies spend less time, resources and money on their online presence than they might have.
Priorities have changed.
Issues that have re-prioritised the importance and investment in online presence and tools now include better brand awareness through greater exposure, increased distribution driving higher sales and new markets, and better processes to increase efficiency and reduce costs.
As customers realise that their commitment to their online tools needs to increase, so too does their requirement for effective development. Once the development has been defined and is nearing completion, the tool requires a means of delivery: effective hosting. Hosting then divides into two categories: shared hosting (otherwise known as virtual hosting, as opposed to virtualised hosting) and dedicated hosting.
Dedicated hosting becomes a requirement once the environment the developer requires is either more complex or more customised than a vanilla shared hosting environment. In short, custom development requires the freedom that only a dedicated hosting environment can deliver.
How service providers are meeting customers’ needs
Dedicated hosting has traditionally been delivered by Carriers, Internet Service Providers or Hosting Providers. Of these, it has quickly become apparent that hosting, particularly dedicated hosting, is a specialisation requiring specific skills to deliver the required product offerings.
As dedicated hosting growth gathers momentum, so too does the need for fast, cost-effective delivery. Until recently, delivering dedicated hosting has meant a long-winded and complex process for service provider and customer alike: specifying and sourcing the right hardware, burn testing, server OS configuration, application configuration, IDC installation and connectivity configuration, and finally a handover to the customer to, only then, start the process of final configuration for production rollout.
The process is long-winded, expensive and complex for all parties concerned. The issues continue for dedicated hosting servers set up this way: when the time comes to upgrade the disk, the RAM or even the whole server, the process begins again from the start.
Virtualisation: not "as good as", but better
New virtualisation technology is now set to deliver dedicated hosting in a way that not only eliminates most of the complexity for service provider and customer alike, but introduces many additional virtualised hosting benefits that have not previously existed.
For service providers, it allows scalable, profitable and fast delivery of premium dedicated hosting.
For customers, it eliminates hardware, hardware drivers and hardware upgrades. In addition, due to the features included in some server virtualisation technology, it delivers far higher levels of availability and allows clones of production environments to be created for seamless development and rollout.
Virtualisation and virtualisation
As either a service provider or a customer, it’s important to understand that many different flavours of server virtualisation exist, bringing different price points, levels of resource control and base-OS independence. Apart from resource control and allocation, stability of, and independence from, the underlying OS is essential to realising all the available benefits of server virtualisation technology and quality virtualised hosting.
Of all the current crop of server virtualisation technology, VMware Virtual Infrastructure 3 seems to lead the market against all of the above criteria, combining the highest available resource control with elimination of hardware drivers. Infrastructure 3 also allows intelligent high-availability redistribution of VMs from failed physical servers to the remaining healthy servers in the farm.
Server virtualisation technology is set to expand its market share as it has in the wider server market – it just depends on whether virtualised hosting service providers and customers alike realise the possibilities available for premium virtualised hosting.
All You Wanted to Know About Proxies
Though "proxy" is a common term in the computer age, it needn't sow doubt in users. We all know what a proxy is: something used in place of the real thing, especially in dubious deals where you send a stand-in to carry out an intended act instead of the real person. Similarly, in the world of servers, proxy servers are fast gaining popularity. A closed proxy can be accessed only by a limited set of people and permits using someone else's computer to conceal your identity and/or location. An open proxy is a proxy server that can be accessed by anyone who uses the Internet.
By and large, a proxy server permits users in a particular network group to cache and forward Internet services such as DNS lookups or web pages, so their bandwidth usage is lessened and controlled. In the case of an open proxy, however, every user on the Internet can avail of this forwarding service.
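As a concrete illustration, here is a minimal Python sketch of sending a web request through a proxy using the widely used requests library. The proxy address below is a made-up placeholder; substitute one you actually control or are authorised to use.

```python
import requests

# Hypothetical proxy host and port; replace with a real, authorised proxy.
proxies = {
    "http": "http://203.0.113.10:3128",
    "https": "http://203.0.113.10:3128",
}

# The target site sees the proxy's address rather than yours.
response = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(response.text)  # reports the IP address the request appeared to come from
```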
Open proxies, also known as anonymous open proxies, allow users to hide their true IP address from the service they access. In fact, proxies have at times been used deliberately to abuse or disrupt a service, which obviously includes breaking its terms of service or the law. For such reasons open proxies have been viewed as problematic from time to time. They have come under severe scrutiny because schoolchildren and office staff alike have been known to use them time and again to access restricted sites during work hours.
The use of proxies can work to your advantage, ensuring that no one takes advantage of you in this virtual world. This is especially true for users who visit social forums on a daily basis. Users may get onto such sites to gain knowledge, learn about the latest developments or grow their social networking base, but there are always those on the prowl for vulnerable and easy targets. And in case you're targeted, your life could become a mess: your credit card accounts could be tampered with, someone could learn all your details, you could be stalked or blackmailed, and worse still you could find yourself shown as visiting sites and forums you never have. Your likeness could even end up in pornography. For protecting your identity and enhancing your online security, the use of proxies works just fine. It permits you to access Facebook, Hotmail, YouTube and others without a worry. These may be popular as fun sites, but they also serve educational needs, helping you learn along the way, or give you a secure social networking group you can depend on. Along the way you can make alliances that help you get a job, do business or simply make a friend. The opportunities are endless, as long as you keep yourself safe from prowlers.
A computer can run as an open proxy server without its owner knowing it. This happens through misconfiguration of the proxy software on the machine, or through malware, viruses, trojans or worms installed for that purpose. Open proxies tend to be slow, sometimes as slow as 14400 baud (14.4 kbit/s), or even below 300 baud, and at times they alternate between fast and slow from one minute to the next. PlanetLab proxies are quicker and were deliberately put in place for public use.
Despite the popularity of proxies, the controversies surrounding them mean there are systems in place that don't permit their use everywhere. IRC networks routinely test client systems for known open proxy types. Mail servers can be configured to routinely check mail senders for proxies with the help of proxycheck software, and they can also consult the various DNSBL servers used to block spam; such servers also list open proxies and help in blocking them.
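For the curious, this is roughly what such a check looks like. The Python sketch below queries a DNSBL the way a mail server would: the IP's octets are reversed and looked up under the blocklist's zone, and any answer means the address is listed. Spamhaus's zen list is shown as a well-known example; check a list's usage policy before querying it in production.

```python
import socket

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """Return True if 'ip' appears in the given DNS blocklist."""
    # DNSBLs are queried with reversed octets: 203.0.113.7 is looked up
    # as 7.113.0.203.zen.spamhaus.org.
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)   # any A record means the IP is listed
        return True
    except socket.gaierror:           # no record means the IP is not listed
        return False

print(is_listed("127.0.0.2"))  # the conventional always-listed test address
```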
Nevertheless, anonymous open proxies have also been hailed because they enhance anonymity, and with it security, when browsing the web or using other Internet services. A proxy conceals the user's true IP address, which matters because your IP address can be used against you: it can help miscreants glean information about you and then try to hack into your computer. Open proxies are also being looked upon as the next big thing for dodging Internet censorship by governments and other organizations, and various web sites make available regularly updated lists of open proxies.
A History of VoIP
The use of VoIP (voice over IP) is increasing rapidly year on year. It is predicted that by the end of 2009 there will be 256 million users of VoIP around the world. The advantages of VoIP in terms of scale, cost and ease of use are now commonly agreed upon. But where did VoIP begin? Who invented it?
The history of VoIP extends further back, into the pre-Internet world, than most people would think. The first VoIP calls were made as far back as 1973. The capability to send voice across a digital network was pioneered on ARPANET, the precursor to the modern Internet. It carried data and voice only between the private network of computers on the ARPANET grid, but the seeds of the VoIP revolution were sown by these pioneers.
VoIP continued to develop among a small cadre of computer users who used the technology to communicate with each other in a sort of geeky version of CB radio. Any two computers connected to the same network could use VoIP technology, but there was no widespread adoption.
The first major step towards the VoIP services that many of us use today was the introduction of software called "Internet Phone" from a US-based company called VocalTec. The first publicly available, off-the-shelf Internet phone software, it was the catalyst for the explosion in VoIP use. The VocalTec software ran on a home PC and used much the same hardware that VoIP services use today: soundcards, speakers and headsets. Internet Phone differed from most modern VoIP services in that it used the H.323 protocol instead of the SIP protocol that is more ubiquitous today.
Although Internet Phone was an immediate commercial success, it suffered from a variety of problems. The lack of high-speed Internet access meant that quality could be poor and the flow of voice slow; early VoIP calls were like using walkie-talkies in terms of signal quality. Another issue was that the two computers talking to each other needed to have the same soundcards with the same drivers for the software to work, which obviously limited the software's usefulness. Much of the transmission went via modems over traditional telephone lines, providing a service of worse quality than a normal phone call.
Once VocalTec had laid the foundations, the increase in the use of VoIP was fairly rapid, accounting for 1% of all US phone calls by 1998. Other companies began to develop software for the VoIP market, and also hardware in the form of hard phones and network switches. The expansion of broadband also aided the growth of VoIP by improving call quality and reducing the latency issues that affected VoIP at the beginning. By the year 2000, VoIP calls were about 3% of the US total.
The popularity of VoIP has only increased since the turn of the millennium, with the free VoIP provider Skype having registered a staggering 400 million user accounts by the end of 2008. With the growing availability of VoIP services for mobile phones, it looks as if the adoption of VoIP will continue to expand rapidly.