My Visitors

Tuesday, January 12, 2010

Data Management in Insurance Industry

While data storage and management is a critical issue for IT professionals across industries, it is especially acute in the insurance industry. Here is a brief overview of the techniques used to manage it.
Data is all-pervasive: it begins well before the initial stages of client understanding and due diligence, and extends far beyond revenue generation, encompassing cross-selling and up-selling of products and services. It also helps in understanding business risks and verifying whether regulatory compliance needs are met. The insurance industry depends on promises made on paper, which are eventually converted into supporting databases and document repositories. This article elaborates on the types of data, the modes of data acquisition, data checks and usage, and the prevalent techniques for data management.
The insurance industry's data can broadly be classified as employee-related, distribution-related, customer-related, product-related, operations-related and accounting-related. Of these categories, employee-related data is required purely for internal workforce operations management, while the rest have a direct impact on the cost and revenue of the insurance company. All data is collected and stored in databases, data warehouses and as documents or images.
Data management stages

Management of data can be defined in three major stages: data acquisition, data quality management, and data exploitation (or data utilization). Let us look at these in detail.

Data acquisition results from new business management and internal operations (HR, accounting, distribution, and product and policy management systems). These are made available in their respective data structures in an integrated way. One step up, they can be consolidated into data warehouses and document management systems, jointly referred to as the universe of the insurance enterprise data.

Data exploitation can cater to different needs such as planning or analysing revenue growth, cost control, improving operational efficiency, planning and executing business expansions, conceptualising new products, and providing data-related services to customers, distribution networks and employees.

Data quality management: Most big insurance enterprises have been operational for several decades, so the data available with them may not be 100% accurate. Many such enterprises still use green screens for systems support and policy administration. Data quality can be maintained and ensured by continuously checking, correcting and preventing data errors, thereby making data ready for exploitation.

The link between data acquisition, data quality management and data utilization can be described by the ICO (Input-Check-Output) model.

Data Acquisition (Input)

Structured data acquisition is critical to performing all subsequent data-related functions in an efficient and integrated manner. Data that is unstructured and not collected in databases is likely to create gaps in data analysis. In today's insurance industry, data acquisition happens in five broad segments:

Customer data: Customer relationship management, customer self-service portals, new business management systems and other customer touch-point systems are the sources for this data. It comprises the customer's personal data such as family, contacts, activities, complaints, service requests, financial and health details, campaign offers, policies, loans and benefits information. This group of data is generally administered in CRM systems, customer portals and IVRS.

Distribution data: Distribution administration, sales and service management, compensation, compliance and other distribution touch-point systems are the sources for this data. This group of data is generally administered in distribution or channel management systems, IVRS, FNA, quotation, application and compliance management systems.

Policy administration data: New business, underwriting management, claims, accounting and actuarial systems are the sources for this data. It comprises financial needs analyses, quotes, new business applications, cashier entries, lock/collection boxes, accounting, valuation, loss ratios, document images, turnaround times, underwriting, claims and policy services information. This group of data is generally administered in legacy policy administration, claims, accounting and actuarial systems; however, there could be a number of separate systems for underwriting, policy services and new business support.

Product administration data: Product administration and pricing systems are the sources for this data. It comprises product setup and management, profiling, pricing, profitability and product performance. A few insurers maintain market research data too. This group of data is generally administered in product management systems, actuarial systems, the DWH and data marts.

Employee data: It comprises employee personal details such as contacts, activities, payroll, educational qualifications, certifications, credentials, job history, and training and development information.
This group of data is generally administered in the HRMS; however, in some cases there may be separate payroll and training and development systems.

Missing, unstructured or disintegrated data in any of the above five categories creates a gap in the data management chain, so it is recommended that these gaps be filled diligently.
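To make the idea of structured acquisition concrete, here is a minimal sketch (my own, not from the article) of how incoming records might be tagged with their segment and source system at load time, so that nothing enters the enterprise data universe unclassified. The segment and system names simply echo the groupings above; they are illustrative, not a real schema.

    # A minimal sketch (not from the article) of tagging records with their segment
    # and source system at load time, so nothing enters the data universe unclassified.
    from dataclasses import dataclass, field
    from datetime import datetime

    SOURCE_SYSTEMS = {
        "customer":      ["CRM", "customer_portal", "IVRS"],
        "distribution":  ["channel_management", "FNA", "quotation", "compliance"],
        "policy_admin":  ["policy_admin", "claims", "accounting", "actuarial"],
        "product_admin": ["product_management", "pricing"],
        "employee":      ["HRMS", "payroll", "training"],
    }

    @dataclass
    class AcquiredRecord:
        segment: str        # one of the five segments described above
        source_system: str  # the operational system that produced the record
        payload: dict       # attributes kept structured from the moment of capture
        loaded_at: datetime = field(default_factory=datetime.now)

        def __post_init__(self):
            # Reject anything that cannot be placed in the data management chain.
            if self.source_system not in SOURCE_SYSTEMS.get(self.segment, []):
                raise ValueError(f"{self.source_system!r} is not a known source for {self.segment!r}")

    # Example: a customer record arriving from the CRM.
    record = AcquiredRecord("customer", "CRM", {"policy_no": "P123", "contact": "a@example.com"})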
Data Quality Management (Check)

Data acquired through various systems and databases needs to be checked for the desired quality before being exploited. Data quality errors can result from inadequate verification of data stored in legacy systems, non-validated data entering from the front end, inadequate integration, redundant data sources or stores, direct back-end updates, and so on. In today's insurance industry, data quality management is mostly ignored. Where implemented, it is done in one of the two ways described below.

Unstructured approach: Most enterprises rely on a few batch programs to check some portions of the acquired data, and most of the time these programs are triggered only by a serious problem identified in customer or financial data. Some enterprises schedule these batch runs; others still run them only on demand. Such intermittent and unorganised batch runs can neither scale nor integrate, nor make an appreciable improvement to the overall data quality of the enterprise.

Structured approach: Structured data quality management contributes greatly to scaling up, integrating and thus creating a big impact on the overall enterprise data quality. A structured data quality management model would pass through the following stages:

1. Extract data from the source and/or target systems.
2. Run quality checks to identify data transfer errors, data link/reference errors and domain integrity errors.
3. Create a data quality mart to hold all error records and related details, to help track and monitor the ageing of each problem and to support other analyses.
4. Integrate the data quality errors into problem/incident trackers so that closures can be tracked.
5. Provide online data quality error reports, with ageing, to the data owners so that they can fix the errors.

The data volume, sensitivity/criticality and the exposure risk of data quality errors play a vital part in designing the right run frequency, level of follow-up, escalation settings, and so on. Data quality errors need to be fixed and prevented in time so that businesses can stop revenue and opportunity losses, cut additional recovery expenses and build the confidence of all stakeholders in the value chain. (A separate paper will discuss in detail the evaluation of existing data quality management, along with gaps, to help insurers implement a proper data quality management system.)
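As an illustration of the "check" stage, here is a minimal sketch, assuming a pandas-based batch job; the table and column names are hypothetical. It runs a link/reference check and a domain-integrity check and collects the failures into an error mart that can be aged and reported to data owners.

    # A minimal sketch of the "check" stage, assuming a pandas-based batch job;
    # the table and column names here are hypothetical, not from the article.
    from datetime import date
    import pandas as pd

    policies = pd.DataFrame({
        "policy_no":   ["P1", "P2", "P3"],
        "customer_id": ["C1", "C9", "C2"],       # C9 has no matching customer record
        "sum_assured": [100000, -5000, 250000],  # a negative value violates domain rules
    })
    customers = pd.DataFrame({"customer_id": ["C1", "C2"]})

    errors = []

    # Link/reference check: every policy must point at an existing customer.
    orphans = policies[~policies["customer_id"].isin(customers["customer_id"])]
    for _, row in orphans.iterrows():
        errors.append({"policy_no": row.policy_no,
                       "error": "missing customer reference",
                       "detected_on": date.today()})

    # Domain-integrity check: the sum assured must be positive.
    for _, row in policies[policies["sum_assured"] <= 0].iterrows():
        errors.append({"policy_no": row.policy_no,
                       "error": "non-positive sum assured",
                       "detected_on": date.today()})

    # The error mart: persisted so that owners can track and age each problem.
    error_mart = pd.DataFrame(errors)
    print(error_mart)

In a real implementation the error mart would be written back to a database and fed into the incident tracker, as described in the stages above.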
Data Exploitation (Output)

Data that has been acquired and checked thoroughly is ready for exploitation. Data exploitation is the key stage which, if done properly, helps reap the benefits of efficient data management. In other words, this is the value-generation stage, covering revenue growth, cost savings, operational efficiency gains, risk controls and so on, all of which are critical for any business. This stage is also viewed as the information management stage. In the insurance industry today, data exploitation, the output stage of data management, is done in one of the two ways described below.

Legacy approach: Most enterprises extract data or information on an ad hoc basis from their operational systems and use applications or batch programs to generate reports that support decision making. This method is not sustainable when demand grows, multi-dimensional needs arise or data becomes voluminous. Moreover, data users have to wait in a long request queue, which may make it too late to take the desired action on the issue for which the data was originally extracted.

Structured approach: With the advantages of structured information management already reinforced in the last couple of paragraphs, an enterprise can readily adapt to any volume or time challenge, creating a big impact on the overall information needs that are critical to the functioning and growth of the enterprise. Structured information management can be implemented as laid out below:

Enterprise Data Warehouse (EDWH): Most of the enterprise data, referred to as the universe, needs to be extracted, loaded and transformed for information needs, and then segmented into summaries and details.

Data marts: Specific business functions (for example, accounting and compliance) can have their own data marts to address the key business problems in their areas.

Reporting needs: Detailed lists and structured (authored and custom) reports can be published from the DWH, data marts and operational data stores.

Analysis needs: Summaries need to be built with appropriate dimensions and measures to enable multi-dimensional analysis from the DWH and data marts.

Information management should be viewed from the perspective of enterprise needs, covering every function of the enterprise that has any impact, small or large, on the business. All functions of the enterprise can be seamlessly integrated through suitable enterprise information management systems. The frequency of refreshing the EDWH and data marts, the extent of data integration and the efficiency of summaries depend on business needs and pace; hence, they need to be worked out during the design stage. The data then needs to be exploited by creating data marts, reports and analyses that bring value to the enterprise.

Conclusion

It is recommended that insurance companies take stock of their data management implementation at all three stages: data acquisition, data quality management and data exploitation. The value of data management should be clearly understood, and structured approaches need to be adopted at all stages. With these in place, an enterprise can make informed decisions, avoid information starvation, remain highly integrated and scalable and, most importantly, stay ahead of the competition.

Health insurance awareness in India is still extremely low

Howard Bolnick, who taught insurance at the Kellogg School of Management for 13 years, was in India for a five-day executive development programme on healthcare financing and health insurance, organised by the Insurance Foundation of India (IFI). Bolnick, a past president of the US Society of Actuaries, took time off to talk to Sarbajeet K Sen on how he viewed the Indian health insurance sector. Excerpts:

How do you see the present stage of the Indian health insurance market?

It is very much in a formative stage still. I know there have been health insurance products for roughly 20 years, but the penetration level is still barely 2 per cent of the population. This is very, very low. It is somewhat stagnant and, listening to the people here, it is very clear that in some sense it is not functional. I think insurance companies in India have things to learn; they must do better. There is a generally bad but improving relationship between the health providers and the insurance companies. There is a low level of awareness of health insurance among the population. It is really not the kind of situation that is ready for takeoff.
However, given that the regulatory framework is improving and that many new players, including foreign companies, are willing to come in, I think things are at a takeoff stage.

Can the penetration levels of health insurance see a dramatic change in India going forward?

You have got a situation where there is 2 per cent market penetration. You will never get to 100 per cent. There are around 300-350 million people to whom you could potentially sell health insurance. I would argue to the health providers that if they cooperate with health insurance companies, the penetration can be as much as 150-200 million, which, by the way, will be growing. I am told there are some companies that take six months to pay claims. That’s absurd. How can you have a relationship with health providers if you are sitting on their money for so long?

There are other things that from a US perspective might look perverse. I understand, not surprisingly, that when an individual goes into a hospital here, one of the first questions asked is whether the person is insured or not. That happens in the US too. But in the US context, you get a better deal than if you are not insured, whereas here it is a worse deal. Should that continue? No. At some point, there needs to be cooperation between the insurance industry and the health providers to set up a foundation to do things that make sense to both of them.

Of late, there has been some debate on whether one should retain the commission structure in the insurance sector. What are your views, given your US experience?

My understanding here in India is that most of the market is retail. You have to reach customers not through a place of employment but through an intermediary. In that sense, in the retail market, intermediaries are vital. In the US, agents are generally compensated in one of two ways, depending on the situation. Sometimes it is consultancy, sometimes it’s a flat fee. But in the retail market, it is mainly a percentage of the premium. It is a balancing act, but where it will end up in India, I don’t know.

The government here is proposing to reduce the paid-up capital requirement of standalone health insurance companies to Rs 50 crore from the present Rs 100 crore. How do you see this move?

I have heard that. Let’s put it this way. You really have to have capital in this industry, particularly when you are at a nascent stage of development. I keep hearing about loss ratios of 140-150 per cent in India, which is very high. This really underlines the need for capital. What happens if you run out of capital? It is not good for the industry and not good for the policyholder if the company goes bankrupt. Bankruptcies and such disruptions will not help the growth of health insurance in general. So, if I were a regulator, I would want to be careful to make sure that there is adequate capital.

What is the average loss ratio of health insurers in the US?

Companies are making money. Health insurance companies in the US have been one of the real darlings of Wall Street.

Is it because they don’t pay claims?

That’s bad. I have heard people say that. It is a political debate. There are certainly instances of it. But it is generally not true. The reason they are so profitable is that they have developed managed care techniques and have learned how to manage their business much better than what we knew as an industry 20 years ago.
They have been able to reap nice profits.

There is a proposal to raise the foreign direct investment (FDI) cap in insurance to 49 per cent from the present 26 per cent. What are your views?

Whether foreign partners should be allowed to own a majority or not is a local issue. But it seems to me that somehow the government and the regulator should create an environment that is encouraging for the right foreign companies to come in. Now, whether that is 49 per cent, 35 per cent or 55 per cent, I don’t know. But it should be a welcoming environment, an open environment, because a lot of foreign companies have knowledge and expertise that will help the health industry to grow.

Savings Association Insurance Fund

The SAIF ended 1998 with a fund balance of $9.8 billion, a 5.0 percent increase over the year-end 1997 balance of $9.4 billion. Estimated insured deposits increased by 2.8 percent in 1998. During the year, the reserve ratio of the SAIF grew from 1.36 percent of insured deposits to 1.39 percent.

For both semiannual assessment periods of 1998, the Board retained the rate schedule in effect for 1997, a range of 0 to 27 cents annually per $100 of assessable deposits. Under this schedule, the percentage of SAIF-member institutions that paid no assessments increased from 90.9 percent in the first semiannual assessment period to 91.9 percent in the second half of the year, as more institutions qualified for the lowest-risk assessment rate category. This rate schedule resulted in an average 1998 SAIF rate of 0.21 cents per $100 of assessable deposits.

The SAIF earned $15 million in assessment income in 1998, compared to $563 million in interest income. In 1998, the SAIF had operating expenses of $85 million and net income of $467 million, compared to operating expenses of $72 million and net income of $480 million in 1997. For the second consecutive year, no SAIF-member institution failed in 1998.

Under the Deposit Insurance Funds Act of 1996, the FDIC must set aside all SAIF funds above the statutorily required Designated Reserve Ratio (DRR) of 1.25 percent of insured deposits in a Special Reserve on January 1, 1999. No assessment credits, refunds or other payments can be made from the Special Reserve unless the SAIF reserve ratio falls below 50 percent of the DRR and is expected to remain below 50 percent for the following four quarters. Effective January 1, 1999, the Special Reserve was funded with $978 million, reducing the SAIF unrestricted fund balance to $8.9 billion and the SAIF reserve ratio to 1.25 percent.

The SAIF Special Reserve was mandated by Congress in the Deposit Insurance Funds Act. It was not proposed in order to address any deposit-insurance issues. However, by eliminating any cushion above the DRR, the creation of the Special Reserve on January 1, 1999, increases the likelihood of the SAIF dropping below the DRR. This, in turn, increases the possibility that the FDIC would be required to raise SAIF assessment rates sooner or higher than BIF assessment rates, resulting in an assessment rate disparity between the SAIF and the BIF. In 1998, legislation that would have eliminated the Special Reserve was introduced in the Congress but did not pass.

With banks experiencing another highly profitable year and only three bank failures, 1998 was another positive year for the BIF, despite adverse trends in the global economic picture. The BIF has grown steadily from a negative fund balance of $7 billion at year-end 1991 to $29.6 billion at year-end 1998. The 1998 fund balance represents a 4.7 percent increase over the 1997 balance of $28.3 billion. BIF-insured deposits grew by 4.1 percent in 1998, yielding a reserve ratio of 1.38 percent of insured deposits at year-end 1998, unchanged from year-end 1997.

Deposit insurance assessment rates in 1998 were unchanged from 1997. For both semiannual assessment periods in 1998, the Board voted to retain rates ranging from 0 to 27 cents annually per $100 of assessable deposits. Under these rates, 95.1 percent of BIF-member institutions, or 8,808 institutions, were in the lowest-risk assessment rate category and paid no deposit-insurance assessments for the second semiannual assessment period of 1998.
This rate schedule resulted in an average 1998 BIF rate of 0.08 cents per $100 of assessable deposits.
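The relationship between the SAIF figures quoted above is simple to check. Here is a back-of-the-envelope sketch (my own, not from the report):

    # A back-of-the-envelope check (not from the report) of how the
    # SAIF figures quoted above fit together.
    fund_balance  = 9.8e9    # year-end 1998 SAIF balance
    reserve_ratio = 0.0139   # 1.39 percent of estimated insured deposits

    insured_deposits = fund_balance / reserve_ratio   # roughly $705 billion
    drr_floor        = 0.0125 * insured_deposits      # the 1.25 percent Designated Reserve Ratio
    special_reserve  = fund_balance - drr_floor       # the amount set aside on January 1, 1999

    print(f"estimated insured deposits: ${insured_deposits / 1e9:.0f} billion")
    print(f"fund balance at the DRR:    ${drr_floor / 1e9:.1f} billion")     # about $8.8 billion
    print(f"special reserve:            ${special_reserve / 1e6:.0f} million")

The sketch lands within about one percent of the reported $978 million Special Reserve and $8.9 billion unrestricted balance; the small gap comes from rounding in the published 1.39 percent ratio.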

Monday, January 11, 2010

From the moment we begin to grasp

From the moment we begin to grasp just how much God loves us, a deep desire wells up within us to love Him in return.

Suppose as you were walking in a dense forest, you strayed off the path and lost your way. Night falls and it becomes dark and cold. You have no food or water. Danger lurks from forest animals. You frantically stumble ahead in the darkness, with no sense of direction. Fear gnaws at you: "I may never make it back to safety."

Just as all hope is fading, your frantic eyes see a dim light in the distance. Suddenly you realize, "Someone is searching for me!" The light grows brighter. It's coming your way. You call out, "Here I am!" The reply comes back: "Keep calling!" Moments later your rescuer appears in the unmistakable uniform of a forest ranger -- one who knows the deep woods and the way home. By the dim light, you study his kind and fatherly face. Steadily he leads you back. As the ranger delivers you to your doorstep, he says, "You're safe now." In sheer relief you respond in the only way you can -- with profound gratitude. "How can I ever repay you?" you ask, knowing no payment could ever suffice.

In the very same way our heavenly Father has rescued us. Our condition was more desperate than we could ever know. We couldn't make it on our own. We were groping frantically in complete darkness when He came and personally guided us home. He redeemed us from deadly peril.

Perhaps the Apostle John had such a sense of gratitude when he declared, "We love Him because He first loved us" (1 John 4:19).

It would not be unusual for you to be harboring distorted images of what your heavenly Father is like. I've had many false images myself -- at different times a "higher power" or a stern judge to be feared. But once I saw His unconditional love for me, a love unaffected by who I was and all I had done, my heart was able to say, "Father, I love you." Amazingly, God not only loves us -- He actually created us so we can love Him in return.

How deeply should we love God? Jesus, who knew the Father intimately, said, "Love the Lord your God with all your heart, with all your soul, with all your mind, and with all your strength" (Mark 12:30). All our faculties come into play. "All" leaves nothing out. Loving the Father to that degree continues to stretch me, for it means giving all that I am to the One who gave His all for me.

My heart swells -- and I believe so does God's -- when I spontaneously lavish my love upon Him, expressing my affection and gratitude to Him.

How am I doing on Day 4? Can you confidently make the following statements?

I am certain that by faith I have been born again -- I've come out of darkness into light. (Remember, our assurance is based on the fact of God's promise, not on how we feel.)

My focus is on the here and now, not where I've been, and not on what the future holds.

I have entered into a new and personal relationship with Jesus Christ. He loves me and He is changing me from the inside out.

We'll be coming back to these themes from time to time. Now, I want to introduce you to a book -- the Bible. It must become your friend and lifelong companion.

Words are important in any culture. They are how we communicate facts, ideas, instruction, encouragement and correction. It shouldn't surprise you that God will communicate with you by words. He even referred to Jesus, His Son, as "the Word" (see John 1).

The Bible consists of 66 books -- a collection of history, poetry, letters and the accounts of those from whom we can learn valuable lessons.
It is the number one means by which you can learn who God is, how He wants you to live and how He will help you.

I didn't always view the Bible as this kind of companion. Before I committed my life to Jesus, I found it confusing, even mysterious. Part of the problem was that I tried to read it like any other book, from page 1. Soon, I got bogged down and put it aside.

After I became a believer, the Bible began to come alive. I started reading one of the four gospels in the New Testament on the advice of a friend and discovered a wonderful account of the life of Jesus. I found it very real, very transparent. I became aware that what I was reading was producing hope in me. I discovered lessons that applied to where I was right then. In fact, often what I would read each morning would apply directly to events happening that very day. More than once I recall saying, "That's amazing!"

If you have a Bible, great! If not, you can access one online at www.biblegateway.com. Try starting with Luke's gospel. Read a little each day. As you read, let the words speak to you, bringing some fresh insight, some new truth. Savor what God tells you through His word -- turning it over in your mind, allowing it to be a vital source of life. (Even though something is confusing, remember everything in the Bible is there for a purpose.)

Another good practice would be to look up the Bible references I mention in this study. This will help you become familiar with "navigating" through the Bible and to see the specific context for a particular verse or verses. (In this 30-day study, I've generally used the New King James Version of the Bible, unless otherwise noted -- such as "NIV" for the New International Version -- but feel free to find a version that is appropriate for you.)

As you know, in any kind of building project, the foundation is critical. For example, in the construction of a new high-rise building, the "unglamorous" work below ground must be done carefully. The foundation is no place to cut corners, even though this phase of construction can seem to take forever. Eventually, activity begins above ground and the building takes shape with remarkable speed. In the same way, your new life in Christ must be built on a solid foundation. The very best foundation is the Bible. Situations change, and friends come and go. But the Bible is "rock solid." Decide today to build your foundation on God's word, making it your lifelong companion.

We all know life can be harsh. Consequently, it may be difficult to look at our circumstances and conclude that God is a god of love. Perhaps you grew up in a broken home with little evidence of love -- possibly even abuse -- from your earthly father. You may have lost loved ones to illness, accident or war. In some areas of the world, poverty and famine are daily realities. Where is God's love in all this?

I believe God's heart aches more than we can ever know at the pain, suffering, injustice and difficulties all His children encounter, especially when you understand that most of it is the result of mankind turning away from Him. Sin's entrance into the world brought severe consequences. Yet from the onset of sin and its ravages, God had the remedy in mind.

That remedy was Jesus. While we were still caught up in sin, God sent His own Son to buy us back -- to "redeem" us.
Here is the way Jesus describes the love of God: "God so loved the world that He gave His one and only Son, that whoever believes in Him shall not perish but have eternal life" (John 3:16 NIV).

The Apostle John also focused on God's love: "God is love. This is how God showed His love among us: He sent His one and only Son into the world that we might live through Him" (1 John 4:8,9 NIV).

Oswald Chambers sums up the linkage between the cross and God's love: "The bedrock of our Christian faith is the unmerited, fathomless marvel of the love of God exhibited on the Cross of Calvary, a love we never can and never shall merit" (March 7).

Billy Graham, in his recent book, The Journey, says, "The more I read the Bible, the more I realize that love is God's supreme attribute" (p. 22).

When you yielded your life to God and were born again, you came face to face with your heavenly Father's love. Now, as you continue your Christian journey, lean into that love, drawing deeply upon it. Immerse yourself in His love and care.

Here is a prayer, penned to the early church in Ephesus by the Apostle Paul, who wrote several letters to new believers that are recorded in the New Testament. You can make this prayer your own: "(that you may) know the love of Christ which passes knowledge; that you may be filled with all the fullness of God" (Ephesians 3:19).

Flat World Economics

First, let me say that I really enjoy my subscription to Web Pro News and, in particular, your articles. I delete a lot of 'stuff' in my inbox, but I really look forward to your 'stuff.'

I'm really interested in finding out whether there is any direct correlation between organic search and PPC. While my empirical evidence is a bit weak, I'll still continue to pursue it. My affiliate site accounts were recently banned by Google. I may make a hat saying that! lol. It is a kind of privilege in a way... if you are the eternal optimist I tend to be.

I believe they were justified in banning me. A lot of my first-time attempts were really, really poor and I believe their "look-back" on my history really did it. That's fine! I wasn't even doing PPC with them at the time of my ban (within the past two weeks).

I was concentrating on building content on my site and SEO. I was using Alexa as my measurement tool. I was consistently bringing down my number (from the high 700K to almost below 400K) today. I have the tools and methods in place to continue this trend. Obviously, I'm doing this to bring organic traffic to my site.

The Google cops stress value, original content, unique content, positive experience... and all that. The only way you get it is to build upon your focus, keep it consistent, and build some more. I'm at the point that they are now sending me traffic. Not a lot, but it's free! Well, at least free in respect to the work I've done.

I would think that they would re-evaluate my site, but that's not likely and I'm not going through that pain. Still, I have to wonder what my quality scores would be with them now if I just started to do PPC. The advice I would provide anyone is this: do not invest in PPC with Google until you have plenty of original content. Just my opinion... 20-30 pages, and make sure they are properly SEO'd.

I only use WordPress now. They rock! Their plugins are simply fantastic and make it much easier to SEO your site. Do not let me inadvertently mislead you. I'm not saying this is easy, and you can make a ton of bad mistakes. You have to learn, learn, learn some more and get advice from people you trust.

But I'm of the belief that if you build enough original content on your site and properly SEO it, you place yourself in a great position to succeed. Wait until you start seeing some free or organic traffic on your site (WordPress has plugins that will enable you to do this). Once you hit this level, then begin PPC. My bet is that you will be more than happy with your results. My money is on a correlation between organic search and PPC with Google. But this is just my opinion.

When we make any kind of significant change in our lives, the first 30 days are critical. Studies show it takes this long to shed an old habit, or establish a new one.

I mention this now to encourage you to press on with this study. Let the "habit" of studying God's word, His character and His ways become well rooted. In the remaining three weeks we'll consider several more topics that are essential to forming a firm spiritual foundation and walking out the new life in Christ. These 30 days are the first installment in your adventure of a lifetime.

The reality is that you, as a new believer, can be pulled off course in a number of ways. Let's look at two major challenges and how you might respond.

The pull of family and friends who don't understand. In coming to Christ you've made a radical choice, the implications of which are just hitting you.
Regardless of how much or how little you've said to others, they are bound to notice early indicators of your changed life. Some will be interested to know more -- an opportunity to share your faith. Others will be antagonistic. They may mock you or whisper behind your back. It's important to realize it's not you, but Christ in you, that has them agitated. There is power in Christ, and that power draws a reaction from others.

The best course is not to try to explain yourself or be defensive. Instead let Christ, Who now lives in you, love them through you. In time they may change, or possibly go their own way. But whatever you do, don't let others pull you back into your old ways.

A sense of shame you feel about things that aren't right in your life. Maybe you're in an illicit or unhealthy relationship, or abusing your body with harmful drugs. You may have long-standing issues with excessive use of alcohol or any number of other habits that aren't easily shaken. You may be saying, "I can't make it in the new life," or, "I'm not good enough," or, "There's no way I can change who I am." Please -- don't give in to these thoughts. If you stay the course, the day will come when, with the Lord's help, you will be able to break free from the habits that pull you down. Take your concerns to Jesus. Talk with Him -- friend to friend. After all, He knows all about you, and He took you to Himself just as you were. He loves you just as you are. Change is working from the inside out.

Press through for the next several weeks. I'm confident you will start seeing some amazing victories, both in yourself and even in the people and circumstances around you.

Key Scripture: He who is in you is greater than he who is in the world (1 John 4:4).

Key Thought: I will hold closely to the Lord today, and not worry about tomorrow.

Questions or comments about this study? Click "reply" if you received this by email and let us know your thoughts.

Statistics indicate that over one billion people visit the Internet each day. Of these, at least a million are searching for answers to the deeper questions of life: Why am I here? What is my purpose? What is God like? A phrase that is frequently used by those exploring such deeper issues is "the meaning of life." Gratefully, the Bible helps us understand that life's meaning is rooted in a unique friendship. Billy Graham says, "It is the greatest discovery you will ever make: You were created to know God and be His friend forever" (The Journey, p. 23).

A key to understanding how you can be God's friend is to know you were created with more than a body, mind and soul. You were created with a spirit. Your human spirit was awakened when you were born again, enabling you to communicate directly with God and have fellowship with Him. The Apostle Paul says we "received the spirit of adoption by whom we cry out, 'Abba, Father'" (Romans 8:15). "Abba" literally means "Daddy."

The heart of every person deeply longs for true friendship. Think about your best friends and how important those relationships are. Yet human friendships can and do fail. This became very real to me recently when the twenty-one year old son of a close friend tragically took his own life. The reason? The young man's closest friend, a classmate he had known for several years, stated firmly he no longer wished to continue the friendship.
The loss was so devastating to my friend's son that life lost its meaning to him, and he completely gave up.

Even though friends on this earth may come and go, in God you have a friend Who will be there forever. Graham says further: "This is a staggering truth. Think of it. The infinite, all-powerful holy God of the universe wants to be your friend! He wants you to know him personally. He wants you to know He is with you. He wants to comfort you when you are upset or anxious. He wants to guide you when you face difficult decisions, and He even wants to correct you when you are about to do something foolish or wrong" (p. 31).

Take a moment now to let this profound reality sink in. You were created to be God's friend. This reflects the enormous heart of your wonderful Lord! What a privilege to be one of His children, to be able to come to Him any time, day or night, to be completely open with Him, to share your most intimate thoughts and fears.

Sunday, January 10, 2010

One of the major barriers to test automation

One of the major barriers to test automation is the volatility of the application to be tested. Even benign changes such as moving a button to a different part of the screen or changing a label from 'Next' to 'Continue' can cause test scripts to fail. This problem is especially acute when testing an application's functionality at the GUI level, because this tends to be the area in which changes are most frequent.

For this reason, test teams have historically tended to avoid functional test automation until an application has become stable. However, this approach does not work when building SaaS products or developing products in an Agile environment where change is a constant. In this situation, a more sophisticated test approach is required to avoid incurring 'technical debt' (i.e., quality issues or tasks that are deferred to be addressed later or not at all).

Develop a Reusable Subroutine Library

The first level of sophistication in test automation is to develop a library of reusable subroutines that encapsulate and hide an application's implementation details from the test scripts themselves. For example, you might implement a test subroutine called 'login_to_my_app()' that accepts two parameters: 'UserName' and 'Password'. At runtime, this test subroutine first finds the appropriate text fields in the application GUI and then executes the keystrokes and mouse movements required to fill in the given user name and password, thus completing the login operation. If the application's login GUI changes (e.g., if the on-screen label for 'User Name' is changed to 'Login Name'), then the 'login_to_my_app()' test function might also need to be updated. However, the scripts that call this subroutine would not need to change.

The reason this type of abstraction is a win for some projects is that a change to the application now only requires you to maintain a single subroutine rather than potentially hundreds or thousands of test scripts. Since complex applications frequently have thousands of test scripts, the savings in effort and time are clear. The trade-off is that the level of skill required to maintain a subroutine library is generally higher, and of a different type, than that needed to maintain test scripts. Obviously this requirement will change your staffing profile and cost structure.

However, the benefit of this approach is that you can accommodate change much more readily. More importantly, it allows you to safely write test cases before the application has been fully implemented. Test script writers only need to know (a) the software's functionality and requirements and (b) the 'stubs' of the test subroutines that will be implemented. You can significantly compress your implementation schedule by enabling your development and test automation processes to function in parallel.
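As a concrete illustration, here is a minimal sketch of such a subroutine, assuming a Python/Selenium test stack; the element locators and URL are hypothetical placeholders rather than details from the article:

    # A minimal sketch of the reusable-subroutine idea, assuming a Python/Selenium
    # test stack. The element locators ("user_name", "password", "login") and the
    # URL are hypothetical placeholders for whatever the real application exposes.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def login_to_my_app(driver, user_name, password):
        """Encapsulates the login flow so that GUI changes are fixed in one place."""
        driver.find_element(By.NAME, "user_name").send_keys(user_name)  # update these locators
        driver.find_element(By.NAME, "password").send_keys(password)    # if the login GUI changes
        driver.find_element(By.ID, "login").click()

    # Test scripts call the subroutine and never touch GUI details directly:
    driver = webdriver.Chrome()
    driver.get("https://example.test/login")  # hypothetical URL
    login_to_my_app(driver, "alice", "secret")
    driver.quit()

If the login screen changes, only the locators inside login_to_my_app() need to be touched; the calling scripts stay the same.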
Use a Domain-Specific Language Approach

The next level of sophistication in scripted test development is referred to as 'keyword' or 'abstract' test automation. In this approach, words from the problem domain, not the implementation, are used to describe test scenarios in what is sometimes referred to as a 'domain-specific language' (DSL). Translator or interpreter software is then used to map the abstract keywords that make up the script into the keystrokes and mouse movements required to drive the application under test.

Sometimes this process uses verb and noun analysis of the use cases or user stories to produce a suitable set of keywords or abstractions for describing test scenarios -- even before any code is written. This functional test automation approach is the most Agile, because test cases can be written first, enabling true test-driven development even at the functional level.

Let us look at a test script for a (simplified) ATM / cash machine as an example. Using a simple self-explanatory syntax for this illustration, a fragment of one test script might look something like this:

    VERIFY ACCOUNT_BALANCE = 10000 INR
    MAKE_WITHDRAWAL 1000 INR
    VERIFY ACCOUNT_BALANCE = 9000 INR

In this example, 'VERIFY' and 'MAKE_WITHDRAWAL' would be verbs in our scripting language, and 'ACCOUNT_BALANCE' would be a noun. Note that the terms used to describe the test scenario are taken from the problem domain and make no reference whatsoever to the implementation. All knowledge of the implementation is completely hidden in the translator software. In many cases, this means that the translator must be quite sophisticated. For example, it may need to know the page structure of the application and then navigate from page to page in order to execute the required functionality.

Developing or customizing a complex translator can be a larger investment up front, but the long-term benefits are significant. And although translator software is typically custom-built for a specific application, general-purpose programmable translators are beginning to appear more frequently. Regardless of its origin, the advantages of a sophisticated translator include the following:

1. Test scripts that are written entirely in the problem domain are robust in the face of nearly any change to the implementation, since all possible implementations must solve the same domain-specific problem. When you have thousands or even tens of thousands of scripts accumulated over the course of years, the scripts themselves become your major investment. Developing the translator is a relatively minor cost compared to this investment.

2. As mentioned earlier, you can easily write scripts before implementing the software, thus enabling a true test-driven development approach. Although new features will sometimes require you to extend the scripting language, this is generally easy to do because the language is taken directly from the domain.

3. Non-technical domain experts can easily author test scripts, because the concepts come directly from the problem domain.

With a good architecture, the translator itself can be made robust in the face of changes to the application under test. This can be done by (a) using a state table to describe page navigation, (b) taking advantage of the object structure of the UI, or (c) using a data-driven approach. Using XML as a scripting language enhances the language's extensibility and can provide a guided, script-driven approach with syntax driven by the XML Schema itself. XML is also easy to parse using off-the-shelf tools.
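To show how thin such a translator can be, here is a minimal, runnable sketch in Python for the ATM script above. In a real harness the keyword handlers would drive the GUI (or an API); here they act on an in-memory balance purely to illustrate the keyword-to-action dispatch:

    # A minimal, runnable sketch of a keyword-script translator for the ATM example.
    # Real harnesses would have the handlers drive the application under test; here
    # they act on an in-memory balance purely to show the keyword-to-action dispatch.

    def parse_line(line):
        """Split one script line into a verb and its argument tokens."""
        tokens = line.split()
        return tokens[0], tokens[1:]

    class AtmTranslator:
        def __init__(self, opening_balance):
            self.balance = opening_balance  # stand-in for the application under test

        def run(self, script):
            for line in script.strip().splitlines():
                verb, args = parse_line(line)
                if verb == "VERIFY" and args[0] == "ACCOUNT_BALANCE":
                    expected = int(args[2])              # tokens: ACCOUNT_BALANCE = 9000 INR
                    assert self.balance == expected, f"expected {expected}, got {self.balance}"
                elif verb == "MAKE_WITHDRAWAL":
                    self.balance -= int(args[0])         # tokens: 1000 INR
                else:
                    raise ValueError(f"unknown keyword: {verb}")

    script = """
    VERIFY ACCOUNT_BALANCE = 10000 INR
    MAKE_WITHDRAWAL 1000 INR
    VERIFY ACCOUNT_BALANCE = 9000 INR
    """
    AtmTranslator(opening_balance=10000).run(script)

Extending the language is then a matter of adding a handler per new keyword; the scripts themselves remain purely in the problem domain.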
Conclusion

Without a doubt, developing a library of reusable subroutines and a domain-specific language to test your system requires an up-front investment. However, in situations where you (a) need to build or customize a SaaS product, (b) wish to truly gain the benefits of an Agile, test-driven development methodology, (c) need to compress your test and development schedule as much as possible while still ensuring high quality, or (d) must cope with constant change during pre- or post-deployment, taking a more sophisticated approach to test automation can be a lifesaver for your project.

There is even an outside concern that one of the fringe members of the EU could require a bailout in the same vein as the lifeline grudgingly being thrown to Dubai by the UAE


One summary of the Euro’s ascent runs, “The bias for risk-seeking is still in vogue.” This has nothing to do with the Euro, but rather is a roundabout way of speaking about the Dollar carry trade, which is responsible for an exodus of capital from the US, some of which has no doubt found its way into Europe. In some ways, then, it’s almost pointless to scrutinize EU economic indicators too closely.

That being said, there are a few meaningful observations that can be made. The first is that the EU economy is tentatively in recovery mode. Some of the most important indicators, such as the German IFO index, capacity utilization, and the Economic Sentiment Indicator, have all ticked up in the last month, while the unemployment rate is holding steady. For better or worse, this improvement can be attributed entirely to export growth, due to the recovery in world trade, which means that the Euro Zone has officially exited the recession.

The second observation is that many expect this exit to be short-lived. Due to the relative rigidity of the EU economy, specifically regarding the labor market, it may take additional time to get back on really solid footing. Thus, as one report puts it, the European Commission “thinks that euro-area unemployment will continue to rise next year, reaching 10.9% in 2011. That will dampen consumer spending. Another worry is investment, which the commission thinks will fall by 17.9% this year. Businesses are unlikely to waste scarce cash on new equipment and offices when they have spare capacity. Firms confident enough to splash out may find it hard to secure the necessary financing from fragile and risk-averse banks.” The Commission also expects public finances to continue to deteriorate, perhaps bottoming at some point next year. There is even an outside concern that one of the fringe members of the EU could require a bailout in the same vein as the lifeline grudgingly being thrown to Dubai by the UAE.

Finally, there is the European Central Bank. Much like the Fed – and every other central bank in the industrialized world, except for Australia’s – the ECB is nowhere near ready to hike rates. “The overall economic context doesn’t suggest that they would want to tighten anytime soon. There is a feeling that, yes, things have improved, but that nonetheless, the outlook is still quite fragile,” summarized one economist. Sure, the ECB is winding down its liquidity programs, but so is the Fed. Based on long-term bond yields, investors believe that US rates could even eclipse EU rates at some point in the future.

In short, there isn’t really much to be optimistic about when it comes to the Euro. The nascent recovery is hardly remarkable, and probably not even sustainable. While the Euro might continue to perform well in the short term for technical reasons, I would expect this edge to evaporate in the medium term.

Friday, January 8, 2010

The Euro is showing strength


The Euro is showing strength in the early European session: after bouncing from 1.4975 ahead of the session opening, it broke above the 1.5040 session high and is reaching levels around 1.5070 at the moment of writing. Next resistance levels, at this point, lie at 1.5085 (Nov 30 high) and, above here, 1.5100 and 1.5140/45 (Nov 26/25 highs). On the downside, below 1.5040, support levels lie at 1.4970 (session low) and, below here, 1.4920/25 and 1.4870. According to Nicole Elliott, senior technical analyst at Mizuho Corporate Bank, the Euro could re-test the year highs over the next sessions: “We see this as a potential interim low might be in place and we shall now re-test this year’s high at 1.5145. Record futures volume on Friday adds weight to this view.”

The dollar relinquished its previous session’s strength against the majors, falling toward the 1.4850-level against the euro and the 1.6602-handle versus the British pound. The US equity bourses rebounded, with the Dow Jones advancing by 1.65%, the S&P 500 advancing by 1.84% and the Nasdaq up by nearly 1.7% by the afternoon session. Crude oil also climbed back above the $80-per-barrel mark. Optimism over the outlook for the US economy was reinforced by reports released earlier in the morning – prompting traders to jump back into riskier assets and sending the euro higher. The advance reading of Q3 GDP sharply reversed the previous quarter’s decline of 0.7% and beat consensus estimates for an increase of 3.3%, instead surging by 3.5%. The advance Q3 sales component of GDP advanced by 2.5%, compared with 0.7% in Q2, while the headline PCE index increased by 2.8% from 1.4% previously. Weekly jobless claims fell by less than expected, coming in marginally lower at 530k versus 531k a week earlier.

The calendar for Friday consists of September personal income, personal consumption, PCE, core PCE, NAPM, Chicago PMI and the University of Michigan consumer sentiment survey. Following the disappointing Conference Board consumer confidence reading earlier this week, traders will focus on the University of Michigan consumer sentiment survey, expected to decline to 70.0 in October from 73.5 a month earlier. The expectations component is forecast to decline to 69.0 from 73.5, while the current conditions index is seen slipping to 72.1 from 73.4.

The dollar sold off sharply across the board in the Wednesday session despite a dearth of US economic data earlier in the morning. The greenback plunged to a fresh 14-month low against the euro past the psychologically key 1.50-level to 1.5040, a new 15-month low versus the Swiss franc at 1.0038 and a 14-month low against the Australian dollar at 0.9326. A shift into riskier assets continues to be detrimental for the US dollar as traders price in improving conditions in the global economy. Crude oil prices climbed higher today, rallying above the $81-per-barrel level by afternoon trading.

The Fed’s Beige Book provided an optimistic assessment of the US economy, saying conditions have stabilized or improved modestly in many sectors since its last report. The Fed said that reports of gains in economic activity outnumber the declines, though the improvements are small and scattered. However, it tempered its assessment by adding that labor markets are typically characterized as weak or mixed, albeit with pockets of improvement. The economic calendar for Thursday will see weekly jobless claims, August home prices and the September leading economic indicators index. Weekly jobless claims are estimated to edge up slightly to 515k from 514k in the previous week. Meanwhile, the leading economic indicators index is forecast to improve to 0.8% from 0.6% in August.

Tuesday, January 5, 2010

MASSACHUSETTS AG BLASTS STATE’S MANAGED COMPETITION; OCABR AND INSURERS DISAGREE

Massachusetts Attorney General (AG) Martha Coakley has released a report entitled Automobile Insurance: The Road Ahead, giving her take on the impact insurance deregulation has had on Massachusetts drivers. Prior to deregulation, or “managed competition,” which began on April 1, 2008, the AG’s office and the Division of Insurance reviewed the expenses and claims experience that insurers were required to submit to them and then set insurance premiums “consistently lower than that proposed by the industry – billions of dollars lower over the past twenty years.” In addition, the AG and insurance commissioner limited premium variations across territories and classes, capped charges on urban drivers, considered only variables such as the insured’s vehicle, driving behavior and garaging location, and required insurers to insure all drivers.

Since managed competition, according to the Coakley report, insurers are no longer required to disclose their data; the rate ceiling has been eliminated, and caps on urban rates are being phased out; insurers reject drivers who are then randomly assigned to insurers in the residual market; and insurers consider other factors besides driving records, including prior coverage limits, payment history and the purchase of homeowners insurance. As a result, AG Coakley says, “While prices have dropped overall, consumers are paying more than they would have had the market not been deregulated.” While more insurers have entered the market, “most of the new entrants have not offered lower rates overall [and] … new insurers have not caused incumbent carriers to lower statewide prices,” Coakley said.

According to the Coakley report, insurers raised their base rates by 10% at the beginning of managed competition, creating “excessive rates in an environment where insurer losses have, on average, decreased over the past several years.” Coakley speculates that Hispanics, low-income consumers, the elderly and urban drivers “may” be paying increased prices and that consumers whose rates have decreased paid more than they should have. The AG accuses insurers of omitting data and information in their public filings, including key rating information, and she charges both insurers and the Automobile Insurers Bureau with “refus[ing] to make public data on claims, premiums and expenses necessary to determine whether statewide rates are fair and not excessive.” The Coakley report concludes that “the current experiment in deregulation has thus far not met its goal. Instead, managed competition has caused many drivers to be overcharged and has led to fewer consumer protections.” In light of her findings and responsibility, Coakley said, “The Attorney General’s Office intends to promulgate consumer protection regulations under her G.L. Chapter 93A Consumer Protection regulatory authority.”

The Massachusetts Office of Consumer Affairs and Business Regulation (OCABR), which oversees the Division of Insurance, refuted Coakley’s report, saying that since “managed competition” began, eleven more insurers have entered the Massachusetts market, increasing competition and reducing rates. OCABR Undersecretary Barbara Anthony said, “Rates have decreased 8.2% on average and that’s a fact. About $270 million in premiums have been saved by consumers.” Two years ago, nineteen insurers wrote auto policies in the state. Currently, thirty insurers compete for coverage, led by Commerce Group (31%), Safety Group (11.1%), Arbella Insurance Group (9.3%), Liberty Mutual (8.5%) and MetLife Auto and Home (6.5%).
Liberty Mutual Group Chairman, President and CEO Edmund Kelly called the Coakley report “flawed” and said “To better meet increased consumer demand under managed competition, we lowered our prices, added new products and improved service across the state. As a result, we have thousands of new customers and over 10% growth since ‘managed competition’ began.” Kelly said that Liberty Mutual is so committed to the new, more competitive insurance landscape in Massachusetts that it is adding 300 jobs at its Springfield, MA operations, further boosting the economic outlook for Massachusetts consumers. Consumers, he added, don’t want the government making decisions for them; “they want to choose for themselves the company they do business with – based on the quality of the product, service and price.”

Saturday, January 2, 2010

The law of civil procedure governs

The law of civil procedure governs process in all judicial proceedings involving lawsuits between private parties. Traditional common law pleading was replaced by code pleading in most states by the turn of the 20th century, and was subsequently replaced again in most states by modern notice pleading. The old English division between common law and equity courts was abolished in the federal courts by the adoption of the Federal Rules of Civil Procedure in 1938, and has also been abolished in nearly all states. The Delaware Court of Chancery is the most prominent of the small number of remaining equity courts.

A slight majority of states have adopted rules of civil procedure closely modeled after the FRCP (including rule numbers). However, in doing so, they had to make some modifications to account for the fact that state courts have broad general jurisdiction while federal courts have relatively limited jurisdiction. New York and California are the most significant states that have not adopted the FRCP. Furthermore, both states continue to maintain their civil procedure laws in the form of codified statutes enacted by the state legislature, as opposed to court rules promulgated by the state supreme court, on the ground that the latter are undemocratic. But certain key portions of their civil procedure laws have been modified by their legislatures to bring them closer to federal civil procedure.[45]

Generally, American civil procedure has several notable features, including extensive pretrial discovery, heavy reliance on live testimony obtained at deposition or elicited in front of a jury, and aggressive pretrial "law and motion" practice designed to result in a pretrial disposition (that is, summary judgment) or a settlement. U.S. courts pioneered the concept of the opt-out class action, by which the burden falls on class members to notify the court that they do not wish to be bound by the judgment, as opposed to opt-in class actions, where class members must join into the class. Another unique feature is the so-called American Rule, under which parties generally bear their own attorneys' fees (as opposed to the English Rule of "loser pays"), though American legislators and courts have carved out numerous exceptions.

Wikipedia's greatest strengths

Wikipedia's greatest strengths, weaknesses, and differences all arise because it is open to anyone, it has a large contributor base, and its articles are written by consensus, according to editorial guidelines and policies.

Wikipedia is open to a large contributor base, drawing a large number of editors from diverse backgrounds. This allows Wikipedia to significantly reduce regional and cultural bias found in many other publications, and makes it very difficult for any group to censor and impose bias. A large, diverse editor base also provides access and breadth on subject matter that is otherwise inaccessible or little documented. A large number of editors contributing at any moment also means that Wikipedia can produce encyclopedic articles and resources covering newsworthy events within hours or days of their occurrence. It also means that, like any publication, Wikipedia may reflect the cultural, age, socio-economic, and other biases of its contributors. There is no systematic process to make sure that "obviously important" topics are written about, so Wikipedia may contain unexpected oversights and omissions. While most articles may be altered by anyone, in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, et cetera) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth.

Allowing anyone to edit Wikipedia means that it is more easily vandalized or susceptible to unchecked information, which requires removal (see Wikipedia:Administrator intervention against vandalism). While blatant vandalism is usually easily spotted and rapidly corrected, Wikipedia is more subject to subtle viewpoint promotion than a typical reference work. However, bias that would be unchallenged in a traditional reference work is likely to be ultimately challenged or considered on Wikipedia. While Wikipedia articles generally attain a good standard after editing, it is important to note that fledgling articles and those monitored less well may be susceptible to vandalism and insertion of false information. Wikipedia's radical openness also means that any given article may be, at any given moment, in a bad state, such as in the middle of a large edit or a controversial rewrite. Many contributors do not yet comply fully with key policies, or may add information without citable sources. Wikipedia's open approach tremendously increases the chances that any particular factual error or misleading statement will be relatively promptly corrected. Numerous editors at any given time are monitoring recent changes and edits to articles on their watchlist.

Wikipedia is written by open and transparent consensus – an approach that has its pros and cons. Censorship or imposing "official" points of view is extremely difficult to achieve and usually fails after a time. Eventually, for most articles, all notable views become fairly described and a neutral point of view reached. In reality, the process of reaching consensus may be long and drawn-out, with articles fluid or changeable for a long time while they find their "neutral approach" that all sides can agree on. Reaching neutrality is occasionally made harder by extreme-viewpoint contributors.
Wikipedia operates a full editorial dispute resolution process, one that allows time for discussion and resolution in depth, but one that also permits disagreements to last for months before poor-quality or biased edits are removed.

That said, articles and subject areas sometimes suffer from significant omissions, and while misinformation and vandalism are usually corrected quickly, this does not always happen. (See, for example, the incident in which a person inserted a fake biography linking a prominent journalist to the Kennedy assassinations and Soviet Russia as a joke on a co-worker; the hoax went undetected for four months, and the author said afterwards he "didn’t know Wikipedia was used as a serious reference tool.") Therefore, a common conclusion is that Wikipedia is a valuable resource and provides a good reference point on its subjects.

The MediaWiki software that runs Wikipedia retains a history of all edits and changes, so information added to Wikipedia never "vanishes". Discussion pages are an important resource on contentious topics. Therefore, serious researchers can often find a wide range of vigorously or thoughtfully advocated viewpoints not present in the consensus article. Like any source, information should be checked. A 2005 editorial by a BBC technology writer comments that these debates are probably symptomatic of new cultural learnings that are happening across all sources of information (including search engines and the media), namely "a better sense of how to evaluate information sources."

The Wikipedia community is largely self-organising, so that anyone may build a reputation as a competent editor and become involved in any role he/she may choose, subject to peer approval. Individuals often will choose to become involved in specialised tasks, such as reviewing articles at others' request, watching current edits for vandalism, watching newly created articles for quality control purposes, or similar roles. Editors who find that editorial administrator responsibility would benefit their ability to help the community may ask their peers in the community for agreement to undertake such roles; a structure which enforces meritocracy and communal standards of editorship and conduct. At present, around a 75–80% approval rating after enquiry is considered the requirement for such a role, a standard which tends to ensure a high level of experience, trust, and familiarity across a broad front of aspects within Wikipedia.
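The point above about MediaWiki retaining every revision can be seen directly through Wikipedia's public API. The short Python sketch below is not from the original post; it simply illustrates, under the assumption that the standard MediaWiki Action API is available, how a page's recent revision history (timestamp, contributor, edit summary) can be fetched. The page title used is just an example.

import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_revisions(title, limit=5):
    """Return the most recent revisions (timestamp, user, edit summary) of a page."""
    params = {
        "action": "query",          # standard MediaWiki Action API query
        "prop": "revisions",        # request the page's revision history
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=10)
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    page = next(iter(pages.values()))   # results are keyed by page ID
    return page.get("revisions", [])

if __name__ == "__main__":
    # Example page title; any article works the same way.
    for rev in fetch_revisions("Encyclopedia"):
        print(rev["timestamp"], rev["user"], rev.get("comment", ""))

Each returned revision carries its timestamp, contributing user and edit summary, which is exactly the audit trail described above: nothing added to Wikipedia simply vanishes.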

American International Group

American International Group Inc. executives’ refusal to repay bonuses as previously promised is “outrageous,” and President Barack Obama’s compensation overseer should be given the power to get the money back, Senator Charles Schumer said.

“This is outrageous,” Schumer said in an interview to be aired on “Political Capital with Al Hunt” this weekend. “They were supposed to return them. They said they would return them. Now many of them are not with the company.”

Schumer, a New York Democrat, said he’d like Kenneth Feinberg, Obama’s special master on executive pay, to try to recover the money, and he’d give him legislative or regulatory authority if it doesn’t already exist. “He would be very much disposed to getting the money back,” Schumer said.

AIG, which received a $182.3 billion U.S. government bailout, ignited a backlash after giving about $165 million in March to employees of its AIG Financial Products unit, which has been blamed for pushing the company to the brink of collapse. Employees of the division pledged to return $45 million in bonus payments after New York Attorney General Andrew Cuomo threatened to release their names to the public. As of October, the employees had returned only $19 million, according to Neil Barofsky, the special inspector general overseeing the government bailouts. Neither Treasury spokeswoman Meg Reilly nor AIG spokesman Mark Herr would comment.

Pay Cuts Ordered

Feinberg was appointed by Obama to monitor compensation of the highest-paid executives at companies that received extraordinary help from U.S. taxpayers, including AIG. Feinberg has ordered pay cuts averaging 50 percent for the top 25 executives, and set a $500,000 cash salary limit for the next 75 workers at AIG and other companies. In the wake of the bonus revelations, some Senate Democrats proposed imposing a 70 percent tax on all companies getting U.S. bailout money.

Last year, lawmakers approved a $700 billion bailout of U.S. banks, insurance companies and automakers after the bankruptcy of Lehman Brothers Holdings Inc. led to a freeze in credit markets and brought several financial firms to the brink of collapse. AIG, which wrote billions in credit default swaps on Lehman, was rescued by the Federal Reserve and the Treasury.

U.S. retailers used

U.S. retailers used extra promotions and extended hours to draw procrastinators and shoppers delayed by the East Coast snowstorm in the final stretch before Christmas. Target Corp. extended its hours to midnight Dec. 21 through yesterday. Borders Group Inc., Wal-Mart Stores Inc. and Toys “R” Us Inc. also kept stores open longer. Best Buy Co. offered some DVDs for half off, and Jos. A. Bank Clothiers Inc., a men’s clothing chain, deepened discounts to at least 50 percent.

“We didn’t intend to do everything, and now we’re doing everything,” Jos. A. Bank Chief Executive Officer Neal Black, 54, said Dec. 22 by telephone from the company’s Hampstead, Maryland, headquarters. “We’ll be slugging right down to the last minute.”

Sales will be compressed into the final days before Christmas, said Marshal Cohen, chief industry analyst at NPD Group Inc. The snowstorm disrupted the Saturday before Dec. 25. Last year, that was the second-biggest shopping day after Black Friday, the day after U.S. Thanksgiving. Shoppers already had procrastinated more than in recent seasons. “Retailers will pull out all the stops this week,” Cohen said in a Dec. 21 Bloomberg Television interview. NPD is a Port Washington, New York-based market research firm.

Maintaining Forecasts

The Washington-based National Retail Federation was holding to its forecast for a 1 percent drop in holiday sales, Ellen Davis, a spokeswoman, said Dec. 20. The International Council of Shopping Centers reiterated on Dec. 22 its forecast for a 2 percent increase in sales at stores open at least a year in December, after reporting that the storm slowed growth to 0.4 percent year over year in the week ended Dec. 19. Jos. A. Bank cut prices of all clothing Dec. 21 and Dec. 22, after store visits slowed, Black said. The chain had planned to offer some of that merchandise at 40 percent and 30 percent off, he said.

The retailer’s shares advanced 77 cents to $42.99 on the Nasdaq Stock Market yesterday. Target, based in Minneapolis, increased 6 cents to $48.85 in New York Stock Exchange composite trading. Borders, based in Ann Arbor, Michigan, rose 8 cents to $1.25. Bentonville, Arkansas-based Walmart declined 2 cents to $53.32. Best Buy, based in Richfield, Minnesota, added 31 cents to $40.76.

Kathryn Greenberg, a 41-year-old Washington resident who works in philanthropy, said she lucked into some “fantastic” late discounts yesterday. She bought clothing for her children and other family members, mostly at 60 percent off, at a Gap store as well as one of Gap Inc.’s Banana Republic stores.

Bigger Savings, More Buying

“I am spending the same as last year, but getting more,” said Greenberg, who was carrying two bags and heading into Sephora, the cosmetics chain owned by Paris-based LVMH Moet Hennessy Louis Vuitton SA. Walmart, the world’s largest retailer, will keep most of its 803 discount stores and its Sam’s Clubs open until 8 p.m. today, two hours later than last year, said John Simley, a spokesman. Amazon.com Inc. extended by one day, until Dec. 21, its cutoff for standard shipping. Gap, based in San Francisco, increased 6 cents to $20.91 on the New York Stock Exchange yesterday. LVMH declined 89 cents to 77.46 euros in Paris trading. Seattle-based Amazon.com, the largest Internet retailer, rose $5.19 to $138.94 on the Nasdaq.

East Coast Snow

Stores along the East Coast closed early during the Dec. 19 snowstorm. Twenty-four inches of snow fell on Bethesda, Maryland, and 23.2 inches were recorded at Philadelphia International Airport, according to the National Weather Service. Consumers had completed 72 percent of their holiday shopping through Dec. 20, down from 80 percent a year earlier, the New York-based ICSC said Dec. 22. Historically, the 10 days before Christmas have made up as much as 40 percent of total holiday sales for November and December, according to Joseph Feldman, a managing director at Telsey Advisory Group in New York.

Sales fell 13 percent to $6.9 billion on the last Saturday before Christmas from the previous year, according to Chicago-based researcher ShopperTrak RCT Corp. Some of the lost sales did translate into online purchases: sales at Web sites jumped 24 percent on Dec. 18 and Dec. 19 from a year ago, according to Coremetrics, a San Mateo, California-based marketing company. Some impulse buying and so-called self-purchases, however, were irretrievably lost during the storm, Richard Jaffe, an analyst with Stifel Nicolaus & Co. in New York, said in a Bloomberg Radio interview on Dec. 22.