Saturday 28 January 2012

Maxis trying to stay one step ahead

ALTHOUGH the LTE/4G spectrum will only be available for use from 2013 onwards, Maxis Bhd has been gearing up for the migration.
“We are always ahead of the curve. We were the first to launch HSDPA and HSPA+ networks. We want to make sure we're the first to do it well. We're well prepared for LTE to provide consumers with an even better experience,” Maxis CEO Sandip Das (pic) tells StarBizWeek.
However, he says the whole ecosystem for the migration from the current 3G to 4G may take a longer time than the infrastructure upgrade. “In 2005, we launched 3G but some 65% of customers still don't use a 3G phone. The whole ecosystem will take some time. Some customers are reluctant to move to new technology,” Das explains.
Last year, the Malaysian Communications and Multimedia Commission named nine companies as recipients of the 2.6GHz spectrum, which is mainly reserved for 4G services.
All nine telecoms players have been allocated spectrum blocks in the 2.6GHz band, although in smaller chunks than originally announced.
Maxis joint chief operating officer Mark Dioguardi says Maxis has been conducting preparatory work for its new network and that 4G will be more efficient in delivery.
Das says telcos have just completed their 3G infrastructure and the migration to 4G would mean additional capital investments, thus he proposed telcos share their network with each other.
“We have a limited population and geography... why duplicate or triplicate the network infrastructure?” he asks. He says incumbents, including Maxis, would have to swallow their egos and be rational by sharing their radio networks in future.
Das says that besides monetising its network and bringing forward return on capital expenditure, network sharing helps telcos avoid duplicating infrastructure and frees up savings for enhancing customer service.
The need for data has also sped up the trend of collaboration between the players with several partnerships taking place this year, such as the network collaboration agreement between Celcom and DiGi.
Last October, Maxis and U Mobile announced they had entered into a multi-billion ringgit agreement to share Maxis' 3G radio access network (RAN), making it the first active 3G RAN-sharing arrangement deployed in Malaysia.
The agreement, which will be for an initial period of 10 years, will also encompass LTE sharing when the spectrum becomes available and the technology is rolled out.
Das says Maxis expects at least RM1bil in revenue in the first five years of its 10-year agreement with U Mobile.
“This sharing will enable U Mobile to get to the market three to four years earlier compared with building the network on its own,” Das says, adding that Maxis was open to any form of sharing as it would allow the telco to monetise its unutilised network.
Dioguardi concurs that telcos should put aside their egos and share their networks as it will benefit users. “It is already happening now. We should stop dragging our feet. We will be gradually ramping up sharing on our network,” he says. He says U Mobile will contribute all its current and future spectrum, which will go into a shared pool.
“We have already built a large part. We will have a framework where Maxis will build the network and U Mobile will lease it. This way, the network can be optimised for better cost effectiveness,” Dioguardi says.
Separately, Das says Maxis' integration story is not over yet. “We have just put the building blocks in place. We are keeping pace with our growing customers. So, it is not an option for us. It is the way our consumer wants us to be and that's the way we are expanding our business.
“Consumers are expecting more and more from us. Five years from now, they will not be just voice customers and will want more services. And we will have to provide them. There will be many instances where there are different life cycles,” Das says.
He says the measure now for telcos is no longer what a customer needs from a certain business but what share of a customer's communication expenses Maxis gets.
“I may have had 70% of your mobile expenses but now 30% of your non-mobile expenses may come along with the extra services. So I am getting an additional share. Therefore, we have to see it in the context of the business expanding based on consumer requirements, and business measures changing with the mix of businesses from which we can get revenue,” Das says.
“It is very critical how we are going to deal with data explosion. Four to five years ago, 25% to 30% of our network was consumed by data but now it has reached 75% to 80%,” Das says, adding that consumer expectations on the Internet have also changed.
Maxis, he says, is making sure it can provide a better experience for its customers. “You may have the coverage but at the same time we need to ensure we have the capacity. Those are the investments we have to make,” Das says.
Separately, Maxis has also been stepping up its product offerings to include the latest major devices such as the iPhone, BlackBerry and Samsung Galaxy S2. The recent launch of the iPhone 4S saw a crowd of at least a thousand queuing up for their phones.
“If the product is good, it'll be an instant hit. You don't have to wait months for it to happen,” Das says.
However, he is quick to clarify that Maxis is not a phone distributor but a service provider, and that distributing handsets is not its core business. “We want to encourage people to be on our system. At one point, Maxis was the only service provider offering the iPhone and people had to come to us to get it,” he says.
Das notes that the practice of bundling a phone with airtime is nothing new to the telecoms industry and, in certain cases, it comes with a reasonable data contract.
“We're not making margin from the phone. Affordability is a problem for some customers wanting to own a smartphone, so we ensure consumers can afford these phones. We bring forward data revenue and any subsidy for the handset is really like a data subscriber acquisition cost to us. We are bundling the phone for users so they can afford it. But we're not a phone reseller,” he says.
Das says: “Yes, no doubt there's a long-term contract because there's subsidy for consumers. For some contracts, the phone is free. The subsidy is the cost for us to acquire a data subscriber. We get people to stay with Maxis for a longer period. At the same time, we could convert a low data user into a strong data user.”
Das also addresses criticism that Maxis packages or services are more expensive than those of other telcos.
“We are tackling it in two ways. One is to put out packages in a manner that provides as much transparency as possible, so that there's no opportunity for bill shock. Don't create an opportunity for bill shock in the first place. The first part, which we are doing now, involves fresh plans. The second part is what we want to do with existing plans to see how we can improve on them.
“Let me state this, we are not in the business of overcharging our customers,” Das says.
He says consumers are less forgiving of Maxis than of anybody else, but Maxis has set the stage for that. “We built the reputation with consumers, so they are harsher towards us than anybody else. I think people are generally more critical towards us ... which is okay. We set those bars.”
Maxis, he says, is trying to provide more clarity and information for consumers, as well as educating them on what will cost them and what will not.
“We are re-doing our packages so that high users can work within their budgets. We also have a notification system to alert them on their usage and what their bill currently stands at.
“All these are part of growing up in the data space. We are very clear that we never overcharge our customers,” Das stresses.
On eroding margins, Das says that at this point in time, margins for voice are under pressure while margins on data are in their infancy.

Managing Risk in Information Technology

As information technology increasingly falls within the scope of corporate governance, management must increasingly focus on managing risks to the achievement of its business objectives.

There are two fundamental components of effective management of risk in information and information technology: the first relates to an organization's strategic deployment of information technology in order to achieve its corporate goals; the second relates to risks to the information assets themselves. IT systems usually represent significant investments of financial and executive resources. The way in which they are planned, managed and measured should therefore be a key management accountability, as should the way in which risks associated with the information assets themselves are managed.
Clearly, well managed information technology is a business enabler. Every deployment of information technology brings with it immediate risks to the organization and, therefore, every director or executive who deploys, or manager who makes any use of, information technology needs to understand these risks and the steps that should be taken to counter them.
ITIL, the Information Technology Infrastructure Library, has long provided an extensive collection of best practice IT management processes and guidance. In spite of an extensive range of practitioner-orientated certified qualifications, it is not possible for any organization to prove - to its management, let alone an external third party - that it has taken the risk-reduction step of implementing best practice.
More than that, ITIL is particularly weak where information security management is concerned - the ITIL book on information security really does no more than refer to a now very out-of-date version of ISO 17799, the information security code of practice.
The emergence of the international IT Service Management (ISO/IEC 20000) and Information Security Management (ISO/IEC 27001) standards changes all this. They make it possible for organizations that have successfully implemented an ITIL environment to be externally certificated as having information security and IT service management processes that meet an international standard; organizations that demonstrate - to customers and potential customers - the quality and security of their IT services and information security processes achieve significant competitive advantages.
Information Security Risk
The value of an independent information security standard may be more immediately obvious to the ITIL practitioner than an IT service management one. The proliferation of increasingly complex, sophisticated and global threats to information security, in combination with the compliance requirements of a flood of computer- and privacy-related regulation around the world, is driving organizations to take a more strategic view of information security. It has become clear that hardware-, software- or vendor-driven solutions to individual information security challenges are, on their own, dangerously inadequate. ISO/IEC 27001 (formerly BS 7799) helps organizations make the step to systematically managing and controlling risk to their information assets.
IT Process Risk
IT must be managed systematically to support the organization in achieving its business objectives, or it will disrupt business processes and undermine business activity. IT management, of course, has its own processes - and many of these processes are common across organizations of all sizes and in many sectors. Processes deployed to manage the IT organization itself need both to be effective and to ensure that the IT organization delivers against business needs. IT service management is a concept that embraces the notion that the IT organization (known, in ISO/IEC 20000 as in ITIL, as the "service provider") exists to deliver services to business users, in line with business needs, and to ensure the most cost-effective use of IT assets within that overall context. ITIL, the IT Infrastructure Library, emerged as a collection of best practices that could be used in various organizations. ISO/IEC 20000, the IT service management standard, provides a best-practice specification that sits on top of ITIL.
Regulatory and Compliance Risk
All organizations are subject to a range of information-related national and international legislation and regulatory requirements. These range from broad corporate governance guidelines to the detailed requirements of specific regulations. UK organizations are subject to some, or all, of:
  • Combined Code and Turnbull Guidance (UK)
  • Basel 2
  • EU data protection, privacy regimes
  • Sectoral regulation: FSA (1) , MiFID (2) , AML (3)
  • Human Rights Act, Regulation of Investigatory Powers Act
  • Computer misuse regulation
Those organizations with US operations may also be subject to US regulations such as the Sarbanes-Oxley Act and SEC regulations, as well as sectoral regulation such as GLBA (4), HIPAA (5) and the USA PATRIOT Act. Many organizations may also be subject to US state laws that appear to have wider applicability, including SB 1386 (the California Information Practice Act) and OPPA (6). Compliance depends as much on information security as on IT processes and services.
Many of these regulations have emerged only recently and most have not yet been adequately tested in the courts. There has been no co-ordinated national or international effort to ensure that many of these regulations - particularly those around personal privacy and data protection - are effectively co-ordinated. As a result, there are overlaps and conflicts between many of these regulations and, while this is of little importance to organizations trading exclusively within one jurisdiction, the reality is that many enterprises today are trading on an international basis, particularly if they have a website or are connected to the Internet.
Management Systems
A management system is a formal, organized approach used by an organization to manage one or more components of its business, including quality, the environment, occupational health and safety, information security and IT service management. Most organizations - particularly younger, less mature ones - have some form of management system in place, even if they're not aware of it. More developed organizations use formal management systems which they have certified by a third party for conformance to a management system standard. Organizations that use formal management systems today include corporations, medium- and small-sized businesses, government agencies, and non-governmental organizations (NGOs).
Standards and Certifications
Formal standards provide a specification against which aspects of an organization's management system can be independently audited by an accredited certification body and, if the management system is found to conform to the specification, the organization can be issued with a formal certificate confirming this. Organizations that are certificated to ISO 9000 will already be familiar with the certification process.
Integrated Management Systems
Organizations can choose to certify their management systems to more than one standard. This enables them to integrate the processes that are common to each of the standards in which they are interested - management review, corrective and preventative action, control of documents and records, and internal quality audits. There is already an alignment of clauses in ISO 9000, ISO 14001 (the environmental management system standard) and OHSAS 18001 (the health and safety management standard) that supports this integration, enabling organizations to benefit from lower-cost initial audits and fewer surveillance visits and, most importantly, to 'join up' their management systems.
The emergence of these international standards now enables organizations to develop an integrated IT management system that is capable of multiple certification and of external, third party audit, while drawing simultaneously on the deeper best-practice contained in ITIL. This is a huge step forward for the ITIL world.
(1) Financial Services Authority
(2) Markets in Financial Instruments Directive
(3) Anti-money laundering regulations
(4) Gramm-Leach-Bliley Act
(5) Health Insurance Portability and Accountability Act
(6) Online Personal Privacy Act

Article Source: http://www.article99.com

Understanding and Using IT Service Management

'ITIL' is a term that is fast gaining currency around the IT world. It is often wrongly described as 'IT governance' - in fact, on its own, it certainly isn't this. ITIL is a collection of best practices that helps companies implement an IT Service Management culture. However, its growing popularity reflects the substantial impact it can make on a company's IT and business performance and the fact that, in combination with other frameworks, it is a vital ingredient in creating true IT governance.
What is IT Service Management?
Today's businesses are increasingly delivered or enabled using information technology. Business and IT management need guidance and support on how to manage the IT infrastructure in order to cost-effectively improve functionality and quality. IT Service Management is a concept that deals with how to define and deliver that guidance and support. In common with other modern management practice, it views things from the customer's perspective, i.e. IT is a service that the customer or consumer receives. It can be made up of hardware, software and communications facilities, but the customer perceives it as a self-contained, coherent entity.
So what is ITIL?
Standing for 'IT Infrastructure Library', ITIL is a set of best practices that are at the heart of the IT Service Management approach. It provides guidance on how to manage IT infrastructure so as to streamline IT services in line with business expectations. ITIL is a best practice framework, presenting the consolidated experience of organisations worldwide on how best to manage IT services to meet business expectations.
ITIL was originally developed during the 1980s by the UK's Central Computer and Telecommunications Agency (CCTA), a government body, which created ITIL version 1 as an approach to incorporating various vendor technologies and serving organisations with differing technical and business needs. CCTA has now become part of the Office of Government Commerce (OGC), which, as official publisher of the ITIL library, updated it, published version 2 and continues to develop and support it.
ITIL has since become widely adopted across the world in both public and private sectors and is recognised as best practice, being deployed in organisations of all shapes and sizes.
What makes up the ITIL Library?
ITIL documentation consists of seven 'sets' or 'volumes': Service Support, Service Delivery, ICT Infrastructure Management, Security Management, Planning to Implement Service Management, The Business Perspective and Applications Management.
Of these, Service Support, Service Delivery and Security Management are considered the central components of the ITIL framework, covering vital issues such as Incident Management, Configuration Management, Change Management, IT Service Continuity Management, Availability Management and IT Security Management.
Learning about ITIL
The seven ITIL volumes are published by The Stationery Office, the official publisher of the UK government, and are available from http://www.itgovernance.co.uk/catalog/23. In addition, to gain an overview and a sense of how to navigate these, it is helpful to consult one of several recommended introductory texts. 'Foundations of IT Service Management Based on ITIL - An Introduction' is widely accepted as the best starting point and self-study guide. 'Implementing Service and Support Management Processes - A Practical Guide' is a thorough and comprehensive handbook on the subject, while the 'itSMF Pocket Guides' provide a good overview of each of the ITIL components. These supporting texts may be obtained at http://www.itgovernance.co.uk/catalog/7.
Getting certified
Part of the reason for the recent growth in ITIL awareness is the publication in December 2005 of a new global standard to which businesses can become certified. ISO 20000 (or ISO/IEC 20000:2005, to give it its correct name) is closely based upon the pre-existing British standard BS 15000 - in fact, the two are virtually indistinguishable. The standard comprises two parts: ISO/IEC 20000-1 is the specification for IT Service Management against which an organisation's practices can be certified; ISO/IEC 20000-2 is the 'code of practice' that provides guidance on meeting the requirements of Part 1.
BS15000 has become widely used around the world since it was published in 2003 and was adopted virtually unchanged as the national standard in Australia and South Africa. A number of companies across the USA, Europe and Asia have already become certified as BS 15000 compliant. We also recommend several excellent books that provide guidance on achieving BS15000/ISO 20000 compliance.
Upon the publication of ISO 20000, BS15000 was withdrawn and individual standards and certification bodies are drawing up their own formal transition programmes for conversion to the new standard. Companies already holding BS15000 should encounter no difficulty in converting their certification to the new standard, as this should be one of the considerations addressed by the individual certifying bodies.
Practitioners can also pursue a structured programme of ITIL examination and certification, comprising the ITIL Foundation Certificate, ITIL Practitioners Certificate and ITIL Managers Certificate. Examinations and certification in Europe are managed through two independent bodies: EXIN, the European Examination Institute for Information Science; and ISEB, the Information Systems Examination Board. Between them, these two organisations control the entire certification scheme. In the United States, HDI is a principal organiser of examination and certification, and it and similar organisations provide coverage elsewhere around the world. These organisations ensure that personal certification is fair, honest and independent of the organisations that provide the training, and accredit training suppliers to bring about a consistent quality of course delivery.
ITIL and IT Governance
When combined with certain other frameworks, ITIL makes a major contribution to the creation of effective IT governance. ITIL processes can be mapped to CobiT (Control Objectives for Information and Related Technology) processes, and the two frameworks complement each other nicely: if the CobiT control framework tells the organisation 'what' to do in the delivery and support areas, ITIL best practices help the organisation define 'how' to deliver these requirements. Similarly, ITIL works very effectively with ISO 17799, the international code of best practice for information security, providing guidance on how to manage the various processes that ISO 17799 prescribes.
By drawing upon these three complementary frameworks as appropriate to its needs, an organisation can establish an IT governance regime that delivers real and lasting competitive advantage to its business. © Copyright 2006, Alan Calder

IT Project Governance And Prince2 Project Management: How To Keep Major IT Investments On The Rails

In today's fast-changing information economy, IT project governance (http://www.itgovernance.co.uk/page.proj_gov) has emerged as one of the most vital corporate responsibilities. The relentless pressure to innovate whilst simultaneously driving down costs means that organisations are increasingly 'betting the farm' on the successful development and deployment of new IT systems. However, the business environment now evolves so quickly that the original assumptions on which projects were based can often become fatally undermined prior to the projects' completion. With technology at the heart of most businesses, the ability to maintain tight executive and board control over such projects throughout their lifecycle has become a deciding factor in determining which businesses thrive and which founder. In response to this challenge, Prince2 project management has emerged as the world's leading methodology for ensuring that IT projects stay on track and deliver real value.
No large scale or business critical project should ever be managed on a standalone basis. The need to involve and secure buy-in from functions right across the organisation means that a project governance approach is essential. While project management is the key discipline within this, project governance is broader in scope and has six interlinked objectives:
  1. Ensuring real business value through project and business alignment.
  2. Controlling costs through centralisation.
  3. Maximising resource allocation, particularly of high value resources.
  4. Risk management through portfolio balancing.
  5. Uniform application of best practice.
  6. Organisational coherence.
IT decisions expose an organisation to significant risks - financial, operational and competitive - so it is essential that project governance be a concern for the board as a whole, rather than any one individual. The board must insist that project risks are assessed within the organisation's strategic planning and risk management framework and ensure that the right investment and management decisions are made, so that competitive advantage can be enhanced and measurable business value delivered.
The board's project governance responsibilities can be summarised as follows:
  •  To approve project initiation, manage the project portfolio and pull the plug on any underperforming projects.
  • To make one or more non-executive board members specifically responsible for overseeing project governance. They must have independent and informed oversight of progress on all business IT projects - including attending program (or large project) board meetings.
  •  To ensure clear accountability at all levels, with detailed, rigorously tested project plans based on a critical path analysis with clearly identified critical success factors, regular milestones and 'go/no go' checkpoints.
  •  To ensure that every project proposal contains a full business case with a fully costed estimate that can stand up to independent audit, with clearly stated assumptions that can withstand rigorous analysis.
  •  To manage all IT related projects as part of a portfolio.
  •  To adopt and deploy a recognised project management methodology.
  •  To adopt a clearly defined risk management plan at programme and project level that reflects corporate level risk treatment requirements.
  •  To institute a monitoring framework to inform the board of progress and provide an early alert of divergence or slippage in any of the critical success factors.
  •  To commit funding only on a phased basis.
  •  To ensure that internal audit is capable and accountable directly to the board for providing regular, timely and unambiguous reports on project progress, slippage, budget, requirements specification and quality requirements. Where there is project divergence the board should not release further funds until the cause of the divergence has been fully dealt with.
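The critical path analysis and slippage monitoring these responsibilities call for can be illustrated with a small sketch. This is not part of Prince2 or any formal methodology; the task names, durations and the `critical_path` helper below are hypothetical, intended only to show why slippage on certain tasks delays the whole project:

```python
# Illustrative sketch only: a minimal critical-path calculation for a
# project plan. The task names, durations and the critical_path helper
# are hypothetical - this is not part of Prince2 or any formal method.

def critical_path(tasks):
    """tasks maps name -> (duration_in_days, [dependency names]).
    Returns (total_duration, critical task names in order)."""
    finish = {}  # memoised earliest-finish time per task

    def earliest_finish(name):
        if name not in finish:
            duration, deps = tasks[name]
            finish[name] = duration + max(
                (earliest_finish(d) for d in deps), default=0)
        return finish[name]

    total = max(earliest_finish(t) for t in tasks)

    # Walk back from the latest-finishing task to recover the chain of
    # tasks with no slack: slipping any of them slips the whole project.
    path, current = [], max(tasks, key=lambda t: finish[t])
    while current is not None:
        path.append(current)
        deps = tasks[current][1]
        current = max(deps, key=lambda d: finish[d]) if deps else None
    return total, list(reversed(path))

plan = {
    "requirements": (10, []),
    "design":       (15, ["requirements"]),
    "build":        (30, ["design"]),
    "procurement":  (20, ["requirements"]),  # runs parallel to design/build
    "test":         (10, ["build", "procurement"]),
    "deploy":       (5,  ["test"]),
}

total, path = critical_path(plan)
print(total, "days via", " -> ".join(path))
```

Any slippage in a task on the returned path delays the whole project by the same amount, which is why the board's 'go/no go' checkpoints and early-warning reports should concentrate on exactly these tasks, while tasks with slack (here, procurement) can absorb some delay.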
In selecting a project management methodology the organisation needs to choose an approach that is appropriate to its project objectives and development environment. By far the most popular methodology is Prince2, the successor to PRINCE ('Projects in Controlled Environments'), which was developed by the UK Office of Government Commerce. While PRINCE was originally developed for IT projects, Prince2 project management has incorporated substantial feedback and is now a generic, best-practice approach for all types of projects. Since PRINCE's introduction in 1989, the methodology has become widely used in both the public and private sectors and Prince2 is now a de facto global standard.
Prince2 project management uses a structured methodology, which means managing a project in a logical and organized way, following clearly defined steps and well-understood roles and responsibilities. It perfectly matches the requirements of a project governance regime by delivering the following attributes to any project:
  •  A controlled and organised start, middle and end
  •  Regular reviews of progress against plan and against the business case
  •  Flexible decision points
  •  Automatic management control of any deviations from the plan
  •  The involvement of management and stakeholders at the right time and in the right place during the project
  •  Good communications channels between the project, project management, and the rest of the organisation.
The effectiveness of Prince2 project management results from its four cornerstones, which define what a successfully managed project should be:
Planned: Prince2 has a series of processes that cover all of the activities needed on a project from starting up to closing down. This process-based approach provides an easily tailored and scaleable method for the management of all types of project. Each process is defined with its key inputs and outputs together with the specific objectives to be achieved and activities to be carried out.
Controlled: Prince2 project management divides a project into manageable stages, enabling efficient control of resources and regular progress monitoring throughout. The various roles and responsibilities for managing a project are fully described and are adaptable to suit the size and complexity of the project, and the skills of the organisation.
Results-driven: Project planning using Prince2 is product-based, which means the project plans are actually focused on delivering results and are not simply about planning when the various activities on the project will be done.
Measured: Any project using Prince2 is driven by the business case, which describes the organisation's justification, commitment and rationale for the deliverables or outcome. The business case is regularly reviewed during the project to ensure the business objectives, which often change during the lifecycle of the project, are still being met.
There are clear reasons why Prince2 project management has become the world's leading methodology. In addition to its best-practice approach for the management of all project types, around 800 people per week take Prince2 project management examinations, with all training carried out by accredited organisations. It is widely used and popular in both public and private sectors, and can easily be tailored to all varieties of projects in many different markets and businesses. For any organization that is serious about managing its IT investment, Prince2 project management is the natural choice.
© Copyright 2006, Alan Calder

Is IT the New "Utility"? You Better Believe It!

Walk into your office sometime and flip the light switch. If your lights do NOT come on, you'll surely be unhappy, upset, frustrated and ready to chew your electric utility's head off. Worst of all, you may be facing the loss of a serious amount of revenue. Your business can simply not carry on without electricity and lights.

Does your Information Technology (IT) share this exalted "essential" status? If not, you're risking serious business damage as IT has now joined electricity, heat, and air conditioning as utilities that we cannot afford to be without.

Just about everywhere, at the very least from an expectation standpoint, your network infrastructure must absolutely be up and running every minute of every day, keeping your Internet access open, your phone system reliable and clear, your fax machines ready to give and receive, and a host of other mission-critical applications functioning at peak efficiency. Should your IT systems break down, you, your company, your people and your customers cannot hope to continue functioning properly together.

CIO Magazine Editor Richard Pastore understands this new reality. He recently wrote a piece touting "convergence" as the new CIO buzzword, a term that replaces the previous buzz-mantra of "alignment." Convergence, Pastore explains, now transcends alignment in that IT can no longer be considered a separate, side entity just trying to fit in, but instead has grown integral to every aspect, every department, and every strategic goal of your business. Networked technology is now tightly woven into the very fabric of every company.

What is the significance and breadth of this new "utility" mindset? For one thing, your professional workers must have high speed Internet in their homes and expect as well to seamlessly connect to the office at any time. This includes large corporations and mid- and small-size businesses alike. For another, it means your customers are never cut off from you, that "technical difficulties" never interrupt your services. Otherwise, calamities could be lurking, ready to strike.

Take the case of a private training company whose sales depended 95% on massive advertising campaigns conducted every weekend. Its CEO knew that his firm's network of IP phones and Internet servers had to be ready for the Monday morning load, or else! Before he adopted the IT-as-utility model, he had to endure occasional Monday morning breakdowns, resulting in losses of $50K per HOUR!

When he finally got religion and revamped his systems, replacing reactive procedures with pro-active monitoring, such beginning-of-the-week traumas came to an end. Now a team of experts closely monitors all his critical devices throughout each weekend, so that come Sunday night he can be assured they will be fully functioning the next morning when his customers, new and old, begin calling or hitting the company website. That's what is meant by "utility" service.

What does it take to make your network a continuously functional, expeditious, and safe "utility" network? First, it takes a solid commitment from the company's senior executive team. It means ending the mindset that IT is just a secondary function, that tech troubles will simply occur from time to time, and that effective strategic management is directed only toward non-technical matters (finance, product development, operations).

Second, it means investing in your firm with wisdom and confidence. Investing with wisdom means engaging a team of experts, whether from your staff, a consulting firm, or an outside vendor, to be sure that your business objectives are in line with current technologies. Someone has to do the research to cut through marketing fluff and determine what is real and practical and what will drive your business today.

Investing with confidence means that once you have the roadmap drawn up, you have the right people designing the integration of that technology into your existing network. This design must take the "utility" approach in that the best products are chosen and configured for full redundancy and that every possible security risk is considered and mitigated.

Third, install a professional project manager to manage the implementation. True project management here coordinates all the necessary resources, assures timely product delivery, effectively plans and communicates necessary down time, and assures proper documentation. The project plan holds people accountable at every stage of the process to implement all necessary work for meeting the project's milestones and deadlines. It assures that they swiftly and effectively complete the implementation, then provide the necessary detailed documentation in order to facilitate both pro-active as well as reactive support.

Finally, the transition from implementation to ongoing support must be immediate and painless. Your project manager will clearly define the steps needed to make this transition as smooth as possible. In your firm's "pre-utility" days, ongoing support meant: "You have a problem? Go call IT." Obviously a terribly reactive, and dismissive, frame of mind. Now, pro-active end-user training engages all users in the success of the transition.

To qualify as a "utility," your IT systems must be viewed in an entirely new light. For your network to truly become what it needs to be, traditional reactive, half-hearted support must become a thing of the past. Imagine your entire workforce, for example, waiting around to get your firm's email system back up. Imagine this takes a few hours. Do a quick calculation in your head of the potential cost of all this waiting around. Without a utility frame of mind, that image of horror is a nightmare waiting to happen.

© Copyright 2007, Charles L. Nault

Getting Started In Information Technology Computer Consulting

One of the best things about being in the Information Technology industry is consulting. For the purposes of this article, I'm using the term consulting to refer to side jobs or moonlighting work. While full-time computer consultants can also benefit from the tips in this article, I'm really writing for the IT employee who works a full-time IT job and then takes extra jobs on the side for extra money.

So you're working your regular job and you want to earn more working for yourself. Here are a few ideas to get started. First, realize that it's now almost impossible to function without a computer in your home. In fact, many homes now have two or three PCs, and eventually they are going to break or will need to be hooked together.


Word of Mouth Is King

To start, get yourself some business cards that explain your services. Do not list your rate; I made this rookie mistake and was still tied to my lowball rate once I was more established. Then start spreading the word at work. Hopefully your employer is tolerant of this. To know the limits, simply ask someone in HR whether the company has a policy regarding work outside the job. If there is no such policy, you may be able to post a notice in the lunchroom or on the company classified-ad board. If this is not an option, just spread the word among co-workers you trust. Word of mouth is always the best way to bring in new business. Everyone knows someone with a broken computer, and you just need to get people talking. Once you get an opportunity, provide more service than the customer expects. Remember, these initial jobs are seeds, so even if you don't make a profit, the goodwill you earn will keep you working down the road.

The golden ring in doing this is to find someone who will recommend you to a small or medium size business that does not have its own IT staff.

Why businesses? Because it's steady work, and businesses know that time is money. Businesses tend to pay on time, they don't keep junk on their systems, and if a job runs over the amount of time you expect, they are generally willing to keep the clock running as long as their systems get fixed. Home clients, on the other hand, tie the money for the job to the price of the PC, which works against us as PCs become cheaper. Businesses assign a monetary value to their time and data, so they are easier clients to work with. They also view hiring you as just another cost of doing business and, once you gain their trust, will not hesitate to let any employee call you in.


Advertising

I've tried advertising in newspapers and never found it to pay off. One of the best things I've done, besides word of mouth, is to use my neighborhood. I put a flyer in each newspaper box advertising my services. The target here is the person who works out of a home office: another attempt to secure a client whose time is money. From there, apply the same principle of outperforming their expectations. Let them know you appreciate referrals and provide them with plenty of extra business cards. I was once hired to separate two businesses during a purchase. While one half was my client, I made sure the other business owner knew the level of my service and went out of my way to ensure his systems worked as well as or better than before I left. Of course, I taped my card to each of his servers.

A better way than walking your neighborhood is to obtain a list of the addresses in your neighborhood, visit http://www.usps.com, and start a mailing campaign. Select the postcard mailing option, upload your flyer, send them your list of addresses, and enter your credit card number. I've found that I can canvass a 300-house neighborhood for about $40.00 to $50.00, much cheaper and more targeted than my other attempts.

The reason you want to use a postcard is twofold: 1) it's cheaper, and 2) it's easy to hang on to. When I used 8 1/2 x 11 paper flyers, I only reached those with an immediate computer problem; everyone else simply tossed the ad. The idea is to get them to keep your card for later, so offer an incentive: give them $10.00 off their first job or offer a free consultation. You want them calling you, not the other guy.


Billing

OK, here's my take on billing and getting paid: judgment is the key. When you bill a business, be sure to add to your invoice that payment is due upon receipt. Does this mean you'll get paid immediately? Nope, but if you leave it out, businesses will assume Net 30 terms and pay you 30 days after receipt, and that's no good. So put "payment due upon receipt" on the invoice and see what happens. I give them 30 days anyway before sending a second invoice with a clear notice that it is a PAST DUE invoice. Most times this clears things up.

I should add that I do have some customers who are inconsistent about how long they take to pay, but they do pay, and furthermore I like working for them. Maybe they are the type that doesn't watch over my shoulder, or gives me the key to the place, or lets me take stuff home to work on. My point is, you be the judge of whether the hassle is worth the delay. Most important, spell out your terms on the invoice and send reminders every 30 days.

Home users are different: you should expect them to pay on the spot or very soon after. Just as the local PC shop expects them to pay before getting their stuff back, so should you. Judgment comes into play here as well. Some will ask you to stop by for one thing and then keep you longer than expected, so do not bring a prepared invoice based on what you believe the charge will be. It's better to tell them the cost and then e-mail them the invoice after you've been paid.


Tax Tips

If you're serious about an ongoing consulting business, take the time to set yourself up properly. This will pay dividends in increased revenue and tax savings. Assign a room in your house as your home office. This will let you deduct the costs related to that office from your earnings as a consultant. There is no law that your business has to make a profit, so as long as you document the expenses you deduct, you can do so even if the expenses exceed your earnings. The benefit here is that the loss from your business yields some tax savings against your regular paycheck. Hopefully your business gets going and earns a profit, but until that's the case you may as well do what you can to save money. A few examples of things you can deduct are insurance, utilities, and internet costs, in proportion to the percentage of your home's square footage that your office consumes. I've even heard that, technically, you can deduct dog-related expenses if you can prove the dog also guards the home office. I don't recommend stretching anything, however.


Sales Tax

Get yourself established as a business in your state so you can charge sales tax. I know this sounds crazy, but if you're going to sell your time, why not sell the parts too and mark them up 10% so you make more money? I used to require my clients to purchase items and then call me to install them. Now, as long as I trust that they'll pay, I order the items myself. This increases revenue, and as long as you keep track of what you charged, you simply pay the sales tax at the end of the year. It couldn't be easier.
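As a toy illustration of the markup-plus-tax arithmetic above (the 10% markup is from the article; the tax rate, and whether tax applies to parts only, vary by state and are assumptions here):

```python
# Toy invoice calculation: labor plus parts marked up 10%, with sales tax
# collected on the parts. All rates here are illustrative assumptions.

def invoice_total(hours, hourly_rate, parts_cost,
                  markup=0.10, sales_tax=0.06):
    labor = hours * hourly_rate
    parts = parts_cost * (1 + markup)   # 10% markup on parts you resell
    tax = parts * sales_tax             # collected now, remitted at year end
    return round(labor + parts + tax, 2)

# Example: 3 hours at $75/hr plus a $200 part
print(invoice_total(3, 75.00, 200.00))
```

Tracking what you charged is then just a matter of keeping each invoice's tax line so the year-end remittance adds up.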

Hopefully those tips will help you start a small business on the side. From there you can grow or shrink the business as you see fit.
By: John Gall

Evaluating IT Security Options

Issues for IT Managers
Assessing IT security claims - especially in the all-important area of perimeter security - is difficult at the best of times. Vendors in this area often exaggerate their offerings (sometimes spectacularly!), and it can be very difficult to compare products side by side. Compounding this problem is the lack of clarity in the terms that are used, making it almost impossible to make rational comparisons between the different offerings available.
This paper has been written for IT managers who aren't security specialists, but who want to make good decisions about IT security. This guide is about giving you the tools for evaluating your options and making good decisions. It's not about what the answers might be.

How Important Is IT Security?
It's often assumed (and always proclaimed by IT security vendors) that the correct answer is "extremely". Realistically, it isn't so. The truth of the matter is that it depends on what you're trying to protect. Just as banks spend more money on physical security than milk bars do, so the right level of IT security for your organisation depends on the value of what you're protecting.
To assess what constitutes a sensible level of protection, you need to consider three things:
  • The value of what you are protecting
  • The likelihood of you having a problem
  • The things you can't replace

So what are you protecting?

There are lots of things to consider here. The bulk of your physical IT assets aren't under threat: that's a question for physical security. But your confidential information and IP are, and the value of these varies dramatically from business to business. Consider, for example, the impact on your business if your accounts receivable data were altered, if your customer contact details were stolen, or if tender documents were accessed without your permission.
The cost of bandwidth, often consumed by hackers (and sometimes by staff!) in very large quantities, is commonly overlooked.
And consider also the cost of the business interruption caused by a compromise. Such costs can't be recovered: lost productivity is lost forever, as anyone who's had a virus infection knows.

How likely is it that it will happen to you?
Businesses often wonder why they would be a target. "I'm not a bank - who would worry about me?" is an understandable comment.
To an extent that's true: some businesses are much higher profile than others. But that analysis ignores the proliferation of automated hacking tools that can search for vulnerabilities across an entire country in a matter of hours. With access to such tools, hackers often no longer worry about targeting: they don't need to. They concern themselves only with who's vulnerable.
Finally, consider the possibility of industrial espionage. It's very unusual, but it does happen.
Like it or not, the Internet is still a lot like the Wild West, and there are some talented and unscrupulous guns for hire out there.

Replacing the irreplaceable
The third category of potential loss you should consider is the question of the irreplaceable. Some things, once lost, can't be replaced.
Lost productivity is one. A large accounting firm lost more than half a day's work for over 50,000 staff as a result of one virus attack.
Confidential information is another: once Pandora's box has been opened, it's too late.
And reputation, of course, is the big one. A good reputation can be very hard to develop, but it can be easy to tarnish.

Summary
There is no fixed answer. IT security may not be important to you, although the fact that you're reading this suggests that it probably is. Just how important it is, only you can answer.


What Makes Good IT Security?
Security is a huge field, and in this paper we are confining ourselves to discussing the first and most important area: perimeter security, the safeguards between your systems and the outside world.


So what makes good perimeter security?
First of all, as an absolute basic level of security, you'll need a firewall. Many vendors make a lot of fuss about firewalls, but the truth is that good firewalls are fairly common nowadays: there's no real rocket science in a good firewall.
It's broadly accepted that stateful packet filtering is better than stateless packet filtering, and application proxies offer an additional level of security as well. But in the end, a run-of-the-mill firewall, well configured, will give better security than the world's best technology poorly set up and managed.
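The stateful idea can be sketched in a few lines. This is a toy illustration (the class, addresses, and API are invented): outbound packets create a connection-table entry, and inbound packets are accepted only when they match a tracked connection.

```python
# Minimal sketch of stateful filtering. A real firewall additionally tracks
# TCP flags and state transitions, applies timeouts, and handles related
# protocols; this toy version keeps only a set of outbound connections.

class StatefulFilter:
    def __init__(self):
        # (local_addr, local_port, remote_addr, remote_port)
        self.connections = set()

    def outbound(self, src, sport, dst, dport):
        self.connections.add((src, sport, dst, dport))
        return "ALLOW"

    def inbound(self, src, sport, dst, dport):
        # Inbound traffic must be a reply to a tracked outbound connection.
        if (dst, dport, src, sport) in self.connections:
            return "ALLOW"
        return "DROP"

fw = StatefulFilter()
fw.outbound("10.0.0.5", 40000, "203.0.113.9", 80)
print(fw.inbound("203.0.113.9", 80, "10.0.0.5", 40000))   # reply: ALLOW
print(fw.inbound("198.51.100.7", 80, "10.0.0.5", 40000))  # unsolicited: DROP
```

A stateless filter, by contrast, would have to judge each packet in isolation, which is exactly why unsolicited inbound traffic is harder for it to reject safely.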

Won't my router do what a firewall will do?
This is a common myth, promoted largely by those who sell routers! If a router could do what a firewall can do, no-one would need firewalls.
The fact that many router manufacturers sell firewalls says a lot.
The truth of the matter is that routers can do a portion of what a firewall can do. But don't confuse the two. A router is designed for speed and efficiency, a firewall for security. It's like comparing a Ferrari with an Abrams tank. Either could replace the other in one sense: you could drive to work in the Abrams, and you could go to war in the Ferrari. But it wouldn't make much sense.
So is a firewall enough?
In most cases, no: firewalls are important but not sufficient. They don't, just by themselves, address issues like viruses or other inappropriate content. They will, quite correctly, let harmful traffic through to mail servers and web servers without hesitation, because they make decisions based on the type of traffic only, not the content! So they don't deal with other important problems like SPAM, either.
They're a critically important starting point and foundation, but that's all they are.

In that case, what else will I need?
That depends on what else you're concerned about. The following list might help.
  • gateway anti-virus
  • VPN
  • SPAM-filtering
  • Content filtering/management
  • Bandwidth management and reporting systems, QoS/traffic shaping
  • Better general security and reporting of intrusion attempts - IDP (Intrusion Detection and Prevention systems)

Gateway Anti-Virus

Gateway anti-virus systems are systems that quarantine viruses at the Internet gateway, before they reach your trusted internal network.

Why would I need that? I have anti-virus on my desktops.
There are several good reasons for using gateway A/V in addition to desktop A/V.
Firstly, it's very hard to ensure that desktop A/V systems are kept up to date, because they're almost all dependent on the operation of the individual desktop or notebook machine, and users make changes to their machines. Even if you can lock the machines down, updates depend on their being frequently connected, and that's hard to control.
The use of a good gateway A/V system doesn't take away the need to keep the desktop systems up to scratch, but it does mean that if something goes wrong on that front, the downside is a lot smaller.
Secondly, most desktop A/V systems are only updated at most daily. This really isn't anywhere near often enough. A good gateway A/V system should be kept scrupulously up to date, and it's a lot easier to keep one system up to date than dozens or hundreds of desktops and notebooks.

So how do I pick a good gateway A/V system?
Here's what you should look for:
  • Find out how often updates are made available. One of the main issues with A/V systems is how well they're kept up to date. Don't be satisfied with "you can download updates as often as you want", because the issue is how often the signature files are updated. Downloading every six hours won't help you if the signature files are only updated weekly.

  • Choose an A/V system for your gateway that's different from the desktop systems. Doing the same filtering twice is not as good as using two different systems.
  • Check out exactly what channels your gateway A/V system will scan for viruses. Ideally, it should be scanning email (SMTP, POP, and IMAP), FTP and HTTP as well.

  • Try to find an A/V system with heuristic detection as well as signature-based detection. Heuristic detection enables your system to trap new viruses before there are signatures written for them.
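The difference between signature-based and heuristic detection in that last point can be illustrated with a toy scanner. The signature bytes, the detection name, and the heuristic rule here are all invented for illustration; real engines use large signature databases and far more sophisticated heuristics (emulation, behavioural analysis):

```python
# Toy scanner: signatures match known byte patterns of known malware;
# the heuristic flags a crude suspicious trait with no specific signature.
# Both the signature table and the heuristic rule are made up.

SIGNATURES = {b"EVIL_PAYLOAD_V1": "Worm.Example.A"}

def scan(data: bytes):
    # Signature pass: exact patterns for known threats.
    for pattern, name in SIGNATURES.items():
        if pattern in data:
            return ("signature", name)
    # Heuristic pass: an executable header buried in a non-executable file.
    if data[:2] != b"MZ" and b"MZ\x90\x00" in data:
        return ("heuristic", "suspicious embedded executable")
    return None

print(scan(b"hello EVIL_PAYLOAD_V1 world"))
print(scan(b"PK...MZ\x90\x00..."))
print(scan(b"just a plain file"))
```

The point of the heuristic pass is exactly what the bullet describes: it can trap a brand-new virus before anyone has written a signature for it, at the cost of occasional false positives.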


SPAM filtering

SPAM filtering is important to many organisations because of the tremendous drain on productivity that SPAM represents. But be aware of this: SPAM filtering cannot ever be 100% reliable. The reason? An email offering you a great deal on a new car will almost certainly be SPAM, unless it happens to come from a friend who knows you're in the market for a new car. No SPAM filtering system can tell the difference.
That said, good SPAM filtering systems can filter upwards of 90% of SPAM.
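The scoring idea behind such filters can be sketched in a few lines. This is a toy illustration, not any particular product's algorithm; the phrases, weights, and threshold are invented:

```python
# Minimal score-based SPAM filter: each matching rule adds to a score,
# and mail over a threshold is flagged. Real filters combine thousands
# of weighted tests; these rules and weights are illustrative only.

RULES = [
    ("free money", 3.0),
    ("act now", 2.0),
    ("new car", 1.5),
    ("unsubscribe", 0.5),
]

def spam_score(text: str) -> float:
    text = text.lower()
    return sum(weight for phrase, weight in RULES if phrase in text)

def is_spam(text: str, threshold: float = 3.0) -> bool:
    return spam_score(text) >= threshold

print(is_spam("FREE MONEY!!! Act now!"))          # True
print(is_spam("Minutes from Tuesday's meeting"))  # False
```

The "new car" rule also illustrates the 100% problem described above: a genuine email from a friend about a new car earns the same score as the SPAM the rule is meant to catch.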

So how do I select a good SPAM filter?
Like anti-virus systems, the main issue to consider is how often they're updated. Like virus writers, senders of SPAM are constantly up to new tricks.
The other important issue is flexibility. It's important to ensure that your SPAM filtering solution gives you the flexibility to adjust it to your needs.

VPN
VPN systems are probably the easiest to choose. IPSec is the standard, but many organisations are happy with the lower security of PPTP because it's easier to set up.
So how do I select a good VPN system?
There are three things to consider:
  • How fast does the product work? Your VPN needs to be able to process data fast enough that the VPN won't become a bottleneck.
  • What key lengths does the product support? In general, 1024-bit keys are considered a prudent minimum.
  • What are the options for client systems? If you're providing VPN facilities for roadwarriors, consider what client software options you have. This is where PPTP support becomes an issue for many organisations (but not for all).


Internet Content Filtering

There are several content filtering systems on the market that work well and have substantial market share. The cold commercial fact is that commercial success matters here, since content filtering businesses are very expensive to set up and run. A warning: content filtering systems aren't 100% effective; sites are being added at a phenomenal rate, so complete coverage just isn't possible. That's why it's good to go with a substantial product in this area: the smaller players generally can't compete with the larger ones.
So how do I choose a content filtering system?
No real surprises here:
  • How often is the database updated? Most of these products use a database of URLs, keywords, or exceptions to make their decisions about how to categorise a given item. As with anti-virus and SPAM filtering, frequency of update is critical.
  • How much flexibility does the product give me, and how much do I need? Some organisations have filtering needs that vary from one group of users to another. If yours does, you need to select a product that supports that. Furthermore, the default setup may not suit your organisation. Check that you have the ability to establish flexible rules.
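The per-group flexibility described in that second point can be sketched as a simple policy lookup. Everything here (the category database, group names, and policies) is invented for illustration; real products ship databases of millions of categorised URLs:

```python
# Toy content filter: one shared URL-category database, but each user
# group has its own set of blocked categories. All data is illustrative.

URL_CATEGORIES = {
    "news.example.com":    "news",
    "games.example.com":   "games",
    "webmail.example.com": "webmail",
}

GROUP_POLICY = {
    "staff":       {"games"},             # staff: games blocked
    "call-centre": {"games", "webmail"},  # call centre: games and webmail blocked
}

def allowed(group: str, host: str) -> bool:
    category = URL_CATEGORIES.get(host, "uncategorised")
    return category not in GROUP_POLICY.get(group, set())

print(allowed("staff", "webmail.example.com"))        # True
print(allowed("call-centre", "webmail.example.com"))  # False
```

Note how the flexibility question becomes concrete: a product that supports only one global policy simply cannot express the two rows of GROUP_POLICY above.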

Intrusion Detection

There are two approaches in the market.
The first is what might be termed the "traditional approach", more commonly called IDSs (Intrusion Detection Systems). These systems involve the comprehensive detection of (and reporting on) anomalous traffic. Typically, they generate huge amounts of data, most of which is of little value, because the fact that the traffic is anomalous does not mean that it is bad. It takes human expertise to work out what's worth acting on.
And in the hands of those with the right expertise, this information can be of tremendous value. That's the upside. The downside is that if you're compromised it will be able to tell you all about it, but it doesn't attempt to prevent it.
The more recent approach is called IDP (Intrusion Detection and Prevention). It's a different approach, and opinions are divided as to whether it's better or worse. It's less exhaustive: it doesn't even attempt to detect and log everything, but rather just the traffic that's undeniably bad. The advantage of this approach is that knowing that it has detected something bad, it can take action, and that's where the "Prevention" part comes in. IDP systems often make on-the-fly changes to security configuration to ensure that bad traffic never gets to your internal systems.
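The "detect known-bad traffic, then act" behaviour of IDP can be sketched as follows. The signatures are invented, and a real IDP works on packets and reassembled streams rather than plain byte strings:

```python
# Toy IDP: traffic matching a known-bad signature triggers an immediate
# block of the source address, so later packets from it never reach
# internal systems. Signatures and addresses are illustrative only.

BAD_SIGNATURES = [b"/etc/passwd", b"cmd.exe"]

class ToyIDP:
    def __init__(self):
        self.blocked = set()

    def inspect(self, src_ip: str, payload: bytes) -> str:
        if src_ip in self.blocked:
            return "DROP"
        if any(sig in payload for sig in BAD_SIGNATURES):
            # The "Prevention" part: reconfigure on the fly.
            self.blocked.add(src_ip)
            return "DROP"
        return "PASS"

idp = ToyIDP()
print(idp.inspect("198.51.100.7", b"GET /etc/passwd"))  # DROP, source now blocked
print(idp.inspect("198.51.100.7", b"GET /index.html"))  # DROP (blocked source)
print(idp.inspect("203.0.113.5", b"GET /index.html"))   # PASS
```

An IDS, by contrast, would log all three events (and much more) for a human to analyse, without ever touching the blocked-set.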

How do I choose an Intrusion Detection system?
The first thing to do, of course, is to decide which general approach you prefer. Then, regardless of your choice, the issues come down to many of the same questions:
  • How often is the IDS/IDP system updated? Like anti-virus and SPAM filtering, frequency of update is critical.
  • What happens when something bad is detected? Whether the system itself acts on it (in an IDP system), or whether your own experts will analyse the reports (in an IDS), there's no point having the system if you can't get a benefit from it.

Integration

Many organisations have been misled by the "best of breed" approach that's touted by some suppliers. The truth of the matter is that a "best of breed" solution is an ill-defined concept. Quality systems are undeniably important, but it's more important to ensure that your systems are well integrated and managed.
A "best-of-breed" firewall and a "best-of-breed" IDS won't deliver the value you expect unless they're properly integrated with each other and with your other security systems. For good security, integration is more important than many people initially think. Anti-virus needs to be integrated with mail servers. It should also work with HTTP proxies and caches. Content filtering should be integrated with email. Your reporting systems should be integrated with the lot!
The point is that the characteristics of an individual technology component aren't the be-all and end-all of security; the ability of the subsystems to work together where appropriate is critical.
 By: Simon Heron

Achieving Strategic Value in the Sale of an Information Technology Company

One of the most challenging aspects of selling an information technology company is coming up with a business valuation. Sometimes the valuations provided by the market (translation: a completed transaction) defy all logic. In other industry segments there are some pretty handy rules of thumb for valuation metrics. In one industry it may be 1 X revenue; in another it could be 7.5 X EBITDA.

Since it is critical to our business to help our information technology clients maximize their selling price, I have given this considerable thought. Why are some of these software company valuations so high? It is because of the profitability leverage of technology. A simple example: what is Microsoft's incremental cost to produce the next copy of Office Professional? It is probably $1.20 for three CDs and 80 cents for packaging. Let's say the license cost is $400. The gross margin is north of 99%. That does not happen in manufacturing or services or retail or most other industries.

One problem in selling a small technology company is that it does not have any of the brand-name, distribution, or standards leverage that the big companies possess. So, on its own, it cannot create this profitability leverage. The acquiring company, however, does not want to compensate the small seller for post-acquisition results that are directly attributable to the buyer's market presence. This is what we refer to as the valuation gap.

What we attempt to do is to help the buyer justify paying a much higher price than a pre-acquisition financial valuation of the target company. In other words, we want to get strategic value for our seller. Below are the factors that we use in our analysis:

  1. Cost for the buyer to write the code internally - Many years ago, Barry Boehm, in his book Software Engineering Economics, developed a constructive cost model for projecting the cost of writing computer code. He called it the COCOMO model. It is quite detailed and complex, but I have boiled it down and simplified it for our purposes. We have the advantage of estimating the "projects" retrospectively, because we already know the number of lines of code comprising our client's products. This information helps us understand what it might cost the buyer to develop the product internally, so that he starts his own build-versus-buy analysis.

  2. Most acquirers could write the code themselves, but we suggest they analyze the cost of their time-to-market delay. Believe me, with a competitor's first-mover advantage or, worse, customer defections, there is a very real cost to not having your product today. We were able to convince one buyer that the number of client defections the acquisition would prevent justified our seller's entire purchase price.
  3. Another arrow in our valuation-driving quiver is to restate historical financials using the pricing power of the brand-name acquirer. We had one client, a small IT company, that had developed a fine piece of software that compared favorably with a large, publicly traded company's solution. Our client's product had the same functionality, ease of use, and open-systems platform, but there was one very important difference: the end-user customer's perception of risk was far greater with the little IT company that could be "out of business tomorrow." We were literally able to double the financial performance of our client on paper and present a compelling argument to the big-company buyer that those economics would be immediately available to him post-acquisition. It certainly was not GAAP accounting, but it was effective as a tool to drive transaction value.
  4. Financials are important, so we have to acknowledge this aspect of buyer valuation as well. We generally like to build in a baseline value, before we start adding the strategic value components, of 2 X contractually recurring revenue for the current year. For example, if the company has monthly maintenance contracts of $100,000 times 12 months = $1.2 million, then $1.2 million X 2 = $2.4 million is the baseline company-value component.
  5. Finally, we try to assign values for miscellaneous assets that the seller is providing to the buyer. Don't overlook the strategic value of Blue Chip Accounts. Those accounts become a platform for the buyer's entire product suite being sold post acquisition into an "installed account." It is far easier to sell add-on applications and products into an existing account than it is to open up that new account. These strategic accounts can have huge value to a buyer.
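The build-versus-buy estimate in point 1 can be made concrete with the basic form of Boehm's model. The sketch below uses the published basic-COCOMO coefficients; the cost per person-month and the 100-KLOC example are illustrative assumptions, not client data:

```python
# Basic COCOMO: effort (person-months) = a * KLOC^b, where (a, b) depend
# on the project class. Coefficients are the published basic-COCOMO values;
# the cost-per-person-month figure is an assumption for illustration.

COEFFICIENTS = {
    "organic":       (2.4, 1.05),  # small teams, familiar problems
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),  # tight hardware/operational constraints
}

def rebuild_cost(kloc: float, mode: str = "organic",
                 cost_per_pm: float = 15_000) -> float:
    """Rough cost for a buyer to rewrite `kloc` thousand lines internally."""
    a, b = COEFFICIENTS[mode]
    person_months = a * kloc ** b
    return person_months * cost_per_pm

# Rough build-versus-buy number for a 100-KLOC product
print(f"${rebuild_cost(100):,.0f}")
```

Even this crude estimate makes the conversation concrete: when an internal rewrite prices out at several million dollars before counting the time-to-market delay in point 2, the seller's asking price starts to look reasonable.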

After reading this you may be saying to yourself, "Come on, this is a little far-fetched." These components do have real value, but that value is open to broad interpretation by the marketplace. We are attempting to assign metrics to a very subjective set of components. The buyers are smart and experienced in the M&A process, and, quite frankly, they try to deflect these artistic approaches to driving up their financial outlay. The best leverage point we have is that those buyers know we are presenting the same analysis to their competitors, and they don't know which component or components of value will resonate with their competition. In the final analysis, we are just trying to give the buyers a reasonable explanation their board of directors can use to justify paying 8 X revenues for an acquisition. A more detailed analysis is contained in the White Paper on our Web site.
Article Source: http://www.articlesontap.ws

This chapter highlights some of the history of IMS and describes how IMS fits into contemporary IT multitiered enterprise architectures.

IMS is IBM’s premier transaction and pre-relational database management system, virtually unsurpassed in database and transaction processing availability and speed. IMS clients have trusted IMS with their most critical business asset—their operational data—for decades.
Today’s IMS has only a superficial resemblance to the product that first shipped in 1969. However, an application program that ran on IMS in 1969 will still run today, unchanged, on the current release of IMS. From the beginning, IBM’s focus on the future of IMS has been unwavering.
IMS continues to be a strategic component of today’s enterprise computing environments.
In This Chapter
  • IMS and the Apollo Program
  • IMS as a Database Management System
  • IMS as a Transaction Manager
  • Who Uses IMS?
  • IMS and Enterprise Integration

IMS and the Apollo Program

On May 25, 1961, United States President John F. Kennedy challenged American industry to send an American man to the moon and return him safely to earth, thus launching the Apollo program. In 1965, North American Aviation, in partnership with IBM, set out to fulfill the resulting requirement for an automated system to manage the large bills of material for the construction of the spacecraft. In 1966, the IBM and North American Aviation teams were joined by three members from Caterpillar Tractor. Together, they designed and developed a system that was called Information Control System and Data Language/Interface (ICS/DL/I).
The IBM team completed and shipped the first release of ICS in Los Angeles in 1967, and in April 1968, ICS was installed. The first “READY” message was displayed on an IBM 2740 typewriter terminal at the Rockwell Space Division at NASA in Downey, California, on August 14, 1968. Less than a year later, on July 20, 1969, Apollo 11 landed on the moon’s surface. ICS was subsequently relaunched as Information Management System/360 (IMS/360) and made available to the IT world. In short order, IMS helped NASA fulfill President Kennedy’s dream and also became the foundation for the database management system (DBMS) business.
Much has changed since 1968; IMS continues to evolve to meet and exceed the data processing requirements demanded by today’s enterprise businesses and governments.

How IT Shapes Top-Down and Bottom-Up Decision Making

What determines whether decisions happen on the bottom, middle, or top rung of the corporate ladder? New research offers a surprising conclusion: The answer often lies in the technology that a company uses.
Information-based systems, such as Enterprise Resource Planning (ERP) software, will push decision-making toward the bottom of the corporate ladder. Communication systems, such as e-mail and instant messaging applications, will push the decision-making process toward the top.
And that means developing an IT strategy isn't all about deploying the best technology, says Raffaella Sadun, an assistant professor of strategy at Harvard Business School.
"If a CEO can trust his senior managers, he will be more willing to decentralize decision-making"
"The bottom line is that whoever is in charge of the acquisitions and the IT strategy, they obviously cannot just think about the technology side, they also have to think about the organizational side," she says. "Traditionally, technology is thought of as a tool that enables empowerment, but that's not always the case."
Sadun discusses the issue in "The Distinct Effects of Information Technology and Communication Technology on Firm Organization," a paper she cowrote with Nicholas Bloom of Stanford University and Luis Garicano and John Van Reenen of the Centre for Economic Performance, London School of Economics.
"Technologies that make the acquisition of information easier at the lower level of the hierarchy are associated with a decentralization of the decision-making process," Sadun says. "On the other hand, we have the communication technologies, which actually do exactly the opposite."

IT's different roles

Companies, however, often fail to consider the disparate roles of their software systems, let alone their effects on organizational behavior. Rather, they lump "information technology" into one amorphous idea—the "IT" department—which encompasses all the technology in the organization.
"Technology tends to be dumped into a single category," Sadun says. "The reality is that IT is a huge, heterogeneous set of technologies."
Similarly, when examining issues such as organization and productivity, industry and academic studies historically tend to treat information and communication technologies as "an aggregate homogeneous capital stock," according to the paper. To that end, Sadun and her fellow researchers set out to show how—and why—managers need to consider the very different organizational effects of communication and information technologies.
"This difference matters not just for firms' organization and productivity, but also in the labor market, as information access and communication technology changes can be expected to affect the wage distribution in opposite directions," their paper states.
The researchers looked at non-production decisions such as capital investment, new hires, and new product plans. Such decisions are either centralized near the top of the corporate ladder or decentralized and delegated to the head of a particular business unit. And the decision makers often depend on ERP software, which facilitates the dissemination of information throughout a large company, enabling detailed coordination among various operating units.
Next, they looked at production decisions, which involve figuring out the tasks necessary to meet the goals and deciding how to pace them. These decisions are generally the bailiwick of either a factory floor worker or a supervisor. For those cases, the researchers studied the role of Computer-Aided Design (CAD) and Computer-Aided Manufacturing (CAM) software in decision-making.
In both instances, the researchers hypothesized that the information software would lead to decentralized decision-making. Because the software eases access to the information necessary to make important choices, both the ERP and CAD systems would increase the likelihood that plant managers and production workers would make decisions and act on them without having to consult an executive at headquarters.
On the other hand, the team hypothesized that a rise in leased lines and corporate intranets would lead to a rise in centralized decision-making at the top of the corporate ladder.

Enabling micromanagement

In the past, communication often depended on faxes, overnight delivery services, "snail mail," or site visits. Even with phone calls, it was difficult for anyone at headquarters to make educated decisions and communicate them to branch offices. In those cases, it was natural to cede control of daily operations to a local manager.
With today's networking technologies, it's easier for top executives to keep a constant flow of communication with branch offices. However, the network may actually deter innovation. When technology makes it easier to communicate, erstwhile independent workers may find themselves pestering their bosses with e-mailed questions throughout the day. Micromanaging executives find themselves making all the decisions and constantly sending mandates down the corporate ladder.
"Whenever there is a reduction in the cost of transmitting information, it's easier for the person down in the hierarchy to communicate with the CEO," Sadun says. "And the CEO can monitor constantly what this person is doing and just give orders, rather than rely on the judgment of those below."
The research team evaluated data from some 1,000 manufacturing firms in eight countries, including detailed technology rollout histories and surveys that gauged the relative decisional autonomy of plant managers and floor workers. (In gauging the factors that determine whether a firm adopts any given technology, the researchers considered geographic variables that might affect the cost of acquiring the technology—the firm's distance from the Walldorf, Germany, headquarters of ERP market leader SAP, for instance, and the fact that telecom industry regulations vary from country to country, which means networking prices vary, too.)
The findings were consistently parallel with the hypotheses: An increase in the penetration of ERP systems led to a substantial increase in plant manager autonomy. A CAD/CAM deployment raised the likelihood of floor worker autonomy. But communication technologies served to lower autonomy, meaning more decisions happened at the corporate level.
"I was reassured and surprised at the same time that these results were holding across countries and industries," Sadun says.

The importance of trust

That said, Sadun notes that technology is hardly the only factor that determines whether a firm allows decision-making both up and down the corporate ladder. Another major factor lies in cultural differences across and within countries. In a separate study, Sadun found that otherwise similar companies showed huge differences in decision-making tactics, according to their geographical location. In the paper "The Organization of Firms across Countries," coauthored with Bloom and Van Reenen, she documents that firms located in areas with high levels of trust tend to be systematically more decentralized than those in areas with low levels of trust.
Sweden and Portugal, for example, seem to be on opposite ends of the trust spectrum. "There's huge cross country heterogeneity in the way even apparently similar firms decide how to allocate decision rights within the firm," Sadun says. "Take Swedish manufacturing companies, for example. You see that they are completely decentralized, and the middle manager is basically a mini-CEO with loads of decision-making power. And then you take a firm that produces exactly the same good, but instead of in Sweden, it's in Portugal. And there, the middle manager doesn't decide anything and is completely dependent on the authority of the CEO.
"In our research," she continues, "we argue that different levels of trust are a key determinant of these differences. If a CEO can trust his senior managers, he will be more willing to decentralize decision-making. For example, there might be a lower concern about the fact that managers will use their power to pursue their personal interests instead of those of the firm."

Information overload: Don't become techno slaves

THE introduction of computers, satellites, the Internet and allied technology or gadgetry has created a flow of information that far surpasses human processing ability.
Information-overload, or "data-smog", is an overwhelming occupational, social and emotional problem.

The impact of information-overload is particularly apparent at the workplace.

More and more people spend their time at work sorting through emails, voice messages and web pages.

Their day is interrupted by phone calls and more incoming emails and faxes, not to mention the papers they have to get through and face-to-face information to be elicited or disseminated.

In a study of a sample of Fortune 1,000 companies, employees averaged 178 core messages a day needing action and three interruptions for some kind of information every hour.

The blessings of information technology allow people at work to access information speedily, not only from 9am to 5pm but also from 5pm onwards until 9am the next day. And, succumbing to that temptation can be harmful, dangerous and destructive.

Our brains, and consequently our bodies, get tired because we remain "wired" for long hours almost until exhaustion, causing mental and physical "burn-out".

In fact, the vast amount of information to digest and process causes stress, anxiety, fatigue, frustration, reduced productivity, an inability to concentrate and a sense of being overburdened.

All of these in turn reduce performance efficiency and affect health and family life.

Home life is interrupted as family members mostly retreat into their private "techno-cocoons", calling, messaging, accessing information or playing games.

As a result, in many homes, we are seeing a major shift in the balance with techno-stress and information-overload causing almost total loss of interaction.

We must, however, be clear that information by itself is not the problem.

One needs it in adequate measure, promptly and of the right quality to succeed in whatever endeavour one is engaged in.

The real problem is when we come to a point where we feel we do not have all the information and that we are not well informed. We then search for more and try to stay on top of it. To our utter consternation, we often realise that we just cannot find all the information soon enough and even less so to digest it all and deal with it.



We should discipline ourselves by limiting the time we spend accessing information by cruising the Internet or sending messages.

To the extent possible, it is always useful and good practice to set aside specific times for calls, communication and meetings.

When there are interruptions, try responding on your own time. Don't reply or be available to everyone all the time. That is absolutely unproductive.

Relax when technology puts you on hold or for whatever reasons you need to wait for a response or to contact someone.

Don't be impatient and get agitated. Rather, use that time to rest your mind or think about other tasks, preferably smaller ones.

Use the technologies that work for you. Don't acquire every new technology when something you're used to works perfectly fine. Of course, if and when the time and circumstances demand that you change, then go for it and familiarise yourself fully to use it efficiently.

Set aside slots for daily family time, exercise, sports, social get-togethers with friends and family vacations.

As we enter the second decade of the 21st century, let's use the vast and readily available information to our benefit by focusing on what's essential, managing our time well, using technologies objectively and, above all, optimally balancing work, study, recreation and family life.

RUEBEN DUDLEY, Petaling Jaya, Selangor

ECONOMY: SME Corp doing a good job

I refer to the report "Microsoft, SME Corp to transform firms" (NST, Dec 6).

I wish to commend SME Corporation Malaysia chief executive officer Datuk Hafsah Hashim and her team for their efforts to improve services to small- and medium-scale enterprises, raising them to a par with international SMEs and providing the best platforms and assistance for them to compete globally.

Microsoft Malaysia managing director Ananth Lazarus said Microsoft would reinvest up to RM20.8 million for the first year if the Office 365 software was adopted by 50,000 SMEs.

Being dependent on the Internet and information technology, we are sure that the above partnership will be useful to us.

Our company was a beneficiary of an SME Corp grant four years ago, and its assistance has helped us gain a good footing to sell our products in international markets, especially the United States, Europe, China and Japan.

The grant scheme is now defunct and has been replaced by other programmes, such as business loans managed by the Malaysian Industrial Development Finance. SME Corp has always been pro-active in advancing the SME agenda, and it has introduced many new and interesting solutions and development programmes to help SMEs do better.

It always works towards the government's call to get Malaysian SMEs to do business globally rather than just providing services or selling their products in the local market.

They have put in place many SME development programmes, such as the One Referral Centre (http://www.smecorp.gov.my/v4/node/73), a one-stop centre for SMEs to get business advice and information.

Acting as the central coordinating agency, one of SME Corp's functions is to provide and disseminate information to SMEs, as well as being a channel for feedback on SME issues.

Advisory services in SME Corp cover business matters as well as programmes available in SME Corp and other ministries, agencies, banks, development financial institutions and associations.

The above-mentioned development programmes are useful for all SMEs. 

For those who have not tried them, do give them a go. There are knowledgeable, helpful and friendly staff to assist you.

I understand there are about 15 ministries and 64 agencies set up by the government to assist SMEs.

I believe  there should be only one coordinating agency for us to deal with.

And if there should be one agency that can carry out such functions well, it would be SME Corp.

Malaysia faces talent shortage in ICT-based industries

KUALA LUMPUR: The country is facing a shortage of talent versed in information and communications technology (ICT) industries, Deputy Education Minister Datuk Dr Wee Ka Siong said.

He said a healthy pool of human capital in all industries is needed to ensure the country develops holistically.

“There are many job opportunities in every sector but if everyone wanted to be a doctor then we would have a shortage of skilled workers in other areas.”

“It is important for students to be realistic and aim for a career path which is suited to their capabilities,” he said at a press conference after witnessing the signing of a memorandum of understanding between Tunku Abdul Rahman College (TARC) and IBM (International Business Machines) Malaysia Sdn Bhd.

The MOU provides for IBM to set up a centre of excellence (COE) at the college’s main campus here, aimed at enhancing the information technology (IT) curriculum that TARC offers to graduates and postgraduates.

Under the collaboration, TARC will provide teaching facilities while IBM will provide its leading software, courseware, training and testing tools for varsities, and participants will be given professional IBM certification.

TARC principal Dr Tan Chik Heok said the academia-industry collaboration was in sync with the college’s strategy to innovate and be responsive to market needs.

“Our collaboration with IBM will enable TARC to produce quality ICT graduates, who will contribute to the nation’s development in this sector,” he said.

He said there had been a significant dip in students enrolling in ICT courses over the past several years.

IBM Malaysia managing director Ramanathan Sathiamutty said the collaboration was an example of the company’s ongoing effort to bring innovation and progress to the nation.

“The country’s shortage of ICT talents becomes apparent when IBM struggles at times to recruit new staff.”

“ICT is an important sector as most businesses nowadays use technologies which require ICT experts to manage and service,” he added.

Average doesn't work now

IN an essay entitled "Making It in America" in the latest issue of The Atlantic, author Adam Davidson relates a joke from cotton country about just how much a modern textile mill has been automated.
The average mill has only two employees today: "A man and a dog. The man is there to feed the dog, and the dog is there to keep the man away from the machines."

Davidson's article is one of a number of pieces that have appeared recently making the point that America's stubbornly high unemployment and sagging middle-class incomes today are largely because of the big drop in demand caused by the Great Recession.

It is also because of the quantum advances in both globalisation and the information technology revolution, which are more rapidly than ever replacing labour with machines or foreign workers.
In the past, workers with average skills, doing an average job, could earn an average lifestyle. But, today, average is officially over.

Being average just won't earn you what it used to. It can't when so many more employers have so much more access to so much more above-average cheap foreign labour, cheap robotics, cheap software, cheap automation and cheap genius.

Therefore, everyone needs to find their extra -- their unique value contribution that makes them stand out in whatever is their field of employment. Average is over.

Yes, new technology has been eating jobs forever, and always will. As they say, if horses could have voted, there never would have been cars.

But there's been an acceleration. As Davidson notes, "In the 10 years ending in 2009, (United States) factories shed workers so fast that they erased almost all the gains of the previous 70 years; roughly one out of every three manufacturing jobs -- about six million in total -- disappeared."

And you ain't seen nothin' yet. Last April, Annie Lowrey of Slate wrote about a start-up called "E la Carte" that is out to shrink the need for waiters and waitresses.

The company "has produced a kind of souped-up iPad that lets you order and pay right at your table. The brainchild of a bunch of Massachusetts Institute of Technology engineers, the nifty invention, known as the Presto, might be found at a restaurant near you soon. You select what you want to eat and add items to a cart".

"Depending on the restaurant's preferences, the console could show you nutritional information, ingredients lists and photographs. You can make special requests, like 'dressing on the side' or 'quintuple bacon'.

"When you're done, the order zings over to the kitchen, and the Presto tells you how long it will take for your food to come out.

"Bored with your dining companions? Play games on the machine. When you're through with your meal, you pay on the console, splitting the bill item by item if you wish and paying however you want. And you can have your receipt emailed to you.

"Each console goes for US$100 per month. If a restaurant serves meals eight hours a day, seven days a week, it works out to 42 cents per hour per table -- making the Presto cheaper than even the very cheapest waiter."
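The per-table arithmetic quoted above can be checked with a short sketch, assuming roughly a 30-day month (a day count the article does not state):

```python
# Rough check of the Presto cost arithmetic quoted above.
# US$100 per console per month, used 8 hours a day, 7 days a week.
MONTHLY_FEE = 100.00   # US$ per console per month
HOURS_PER_DAY = 8
DAYS_PER_MONTH = 30    # assumption: 7 days a week over roughly a month

hours_per_month = HOURS_PER_DAY * DAYS_PER_MONTH   # 240 hours
cost_per_hour = MONTHLY_FEE / hours_per_month      # ~US$0.417
print(f"{cost_per_hour * 100:.0f} cents per hour per table")  # prints "42 cents per hour per table"
```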

What the iPad won't do in an above-average way, a Chinese worker will. Consider Sunday's article in The New York Times by Charles Duhigg and Keith Bradsher about why Apple does so much of its manufacturing in China.

"Apple had redesigned the iPhone's screen at the last minute, forcing an assembly-line overhaul. New screens began arriving at the (Chinese) plant near midnight.

"A foreman immediately roused 8,000 workers inside the company's dormitories, according to the executive. Each employee was given a biscuit and a cup of tea, guided to a workstation and within half an hour started a 12-hour shift fitting glass screens into beveled frames.

"Within 96 hours, the plant was producing over 10,000 iPhones a day. 'The speed and flexibility is breathtaking,' the executive said. 'There's no American plant that can match that.'"

And automation is not just coming to manufacturing, explains Curtis Carlson, the chief executive of SRI International, a Silicon Valley idea lab that invented the iPhone programme known as Siri, the digital personal assistant.

"Siri is the beginning of a huge transformation in how we interact with banks, insurance companies, retail stores, healthcare providers, information retrieval services and product services."

There will always be change -- new jobs, new products, new services. But the one thing we know for sure is that with each advance in globalisation and the IT revolution, the best jobs will require workers to have better education to make themselves above average.

Here are the latest unemployment rates from the Bureau of Labor Statistics for Americans over 25: those with less than a high school diploma, 13.8 per cent; those with a high school diploma and no college, 8.7 per cent; those with some college or associate degree, 7.7 per cent; and those with a bachelor's degree or higher, 4.1 per cent.

In a world where average is officially over, there are many things we need to do to buttress employment, but nothing would be more important than passing some kind of GI Bill for the 21st century that ensures that every American has access to post-high school education.

3 rewarding PTPL diplomas in IT

FOR a rewarding career in the information sciences industry, enrol in PTPL's information technology diploma programmes.
The college offers three such courses, namely the Diploma in Information Technology, Diploma in Graphic Design and Diploma in Multimedia Technology, all accredited by the Malaysian Qualifications Agency (MQA).

The Diploma in Information Technology (DIT), which grooms future IT professionals, exposes students to computing and analytical skills.

The Diploma in Graphic Design (DGD) is designed to train students on the application of graphic design techniques used in the production of multimedia-related work, advertising and packaging.

Besides learning the effects of graphics on human psychology, students will have the opportunity to understand the principles in multimedia preparation. They will be trained on drawing techniques and the basics of photography.

The Diploma in Multimedia Technology (DMT) focuses on multimedia concepts, the electronic information industry, hypermedia and hypertexts. It also places emphasis on the application of multimedia in education, training, business and entertainment.

Among the subjects offered are Multimedia Design, Basics of 2D and 3D Graphics and Animation, Advanced Multimedia Composition, Design of Digital Printing Material and Digital Audio/Visual.

Diploma holders are eligible for transfer credits upon enrolling into the Management & Science University (MSU).

This privilege will ensure students a place in the second year of their degree courses at MSU.

Graduates may seek employment as multimedia software developers, multimedia software tool developers, multimedia technologists, programme analysts, multimedia software project managers, multimedia consultants, software entrepreneurs, academicians or researchers.

Students with credits in Bahasa Melayu, Mathematics and Science in SPM/SPMV are invited to join the programme.

Those with lower qualifications who wish to join the programme are encouraged to take up the Certificate in Computer Science before advancing to the diploma programme.

China-M'sia can work towards successful industrialisation

KUALA LUMPUR: China and Malaysia can cooperate towards successful industrialisation, especially for small and medium enterprises (SMEs), says China's vice-minister of industry and information technology, Xi Guohua.
China, he said, is in the process of promoting industrialisation and an information-based economy to strengthen its competitiveness and set the foundation for building an affluent society.

Xi said he met with officers from the Ministry of International Trade and Industry (MITI) yesterday regarding China's industrial restructuring and improvements in scientific innovation.

He was in Kuala Lumpur to attend the 10th Asean Telecommunications and Information Technology Ministers' Meeting, which ended today. China is one of Asean's Dialogue Partners.

Speaking to reporters after the meeting, Xi said Malaysia had shown great interest in how China promotes industrialisation and information technology in the middle and western parts of the country.

China and Malaysia, he added, can share experiences and methods as both countries are on the industrialisation path to economic growth.

"China and Malaysia are two countries which have a long-standing relationship. Malaysia is the largest trading partner of China, among all the Asean countries.

"Therefore, the discussion of cooperation in the industrial sector, especially for the SMEs, will exert great impact on our future development," Xi said.

He also highlighted that China-Malaysia bilateral trade volume is expected to reach US$50 billion in 2010. Bilateral trade between the two countries in 2009 totalled US$36.34 billion.

According to Xi, MITI is also keen on cooperating with China to further strengthen the SME sector as it plays an important role in driving the economies of both countries.

"In China and Malaysia, the SMEs contribute between 50 and 60 per cent of the gross domestic product (GDP). In China, 80 per cent of the employment opportunities are created by SMEs," he said.

Xi also invited Malaysian businessmen to the China International SME Fair in Guangzhou and the China-Asean Expo in Nanning, to explore business opportunities in the country as well as boost trade and investment.

On the 10th Asean Telecommunications and Information Technology Ministers' Meeting, Xi said information and communication technology (ICT) is an important driver of industrialisation.

"As high energy consumption will not be sustainable in the future, we have to employ high-tech applications, and ICT can play a role in transforming and upgrading traditional industry," he noted.

ICT, he said, can help improve corporate governance and the sales system by connecting users with cutting-edge technologies, while the internet will increase operating efficiency.

"SMEs in China are lagging in this regard and the government has to do a lot to help them, for example, building a public service platform for them," he added.

Xi said ICT development, especially for the SMEs, is important for the economic development of not only China, but also the Asean member countries to benefit regional socio-economic development.

China, he said, is committed to cooperating with Asean to enhance ICT development by focusing on six areas. Among these are human resources and broadband development.

Xi said the proposed Asean "Super Digital Corridor", similar to China's Information Super-Highway concept, will promote the socio-economic development of Asean-China, if the two ideas are combined. --BERNAMA