General Products

Profiles About Jobs – Marketing Research

Posted on December 9, 2018 in Uncategorized

Before proceeding with this article, you should first understand the concept of marketing research. In marketing, whenever a new product is launched or a service is offered, you can be sure that a great deal of research work has gone on behind it: data has been collected and analyzed to judge and predict the benefits to both the public and the company at the launch of the new product or service. Marketing research is the collective term for all of these marketing processes. It can be divided into two distinct categories: consumer marketing research and business-to-business marketing research.

The work of a marketing research employee can be quite tedious, but at the same time fun and challenging. The profile of a marketing research analyst can be thrilling. He or she should hold a formal master's degree in business administration with a specialization in marketing. Certain things are expected of an employee in such a post; they are outlined below.

Understanding the market-based economy

One has to apply a knowledge of sociology to understand the patterns and trends of a market-based economy, and he or she should be able to interpret the behavior of target consumers in the market. Using a knowledge of statistics, the employee must analyze the likely success of any new product or service launched for this target group. It is important to understand that the elements that make up the marketing mix can have a huge impact on the way a customer will react or respond to any new product.

Should be in a position to take strategic decisions

Employees engaged in analyzing market analysis reports should be in a position to take quick and appropriate decisions when required. These decisions may concern potential opportunities, market segmentation and segregation, marketing performance, the implementation of marketing programs, and many other such areas.

Should keep up to date with the current scenario

A good decision cannot be made by a good student alone. He or she must be in a position to evaluate the current situation, and should be aware of all the government policies concerning the field of work at that moment. He or she should also stay updated on the public policies and technology developments taking place that can deeply affect marketing research work.

A marketing research analyst has to establish an important link between the environment and the marketing variables in order to understand the behavior of customers.

New challenges are always waiting for marketing research employees, helping them improve and grow as responsible workers. These are high-paying jobs, and there are many opportunities in this field.

Meet the New Software Analyst

Posted on November 24, 2018 in Uncategorized

As US equity markets closed out 2013 at new highs, the future of equity research faces significant change. With “price targets” being reset for many soaring social, cloud and big data analytics stocks, let’s meet the new software analyst. But first, a little background.

Equity research has marginally evolved with investment styles and trading strategies over the past couple of decades. The days of primary fundamental research, particularly on the sell-side, faded long ago. Most analysts don’t have the gumption or the time.

Shrinking commissions and heightened regulatory scrutiny yield lower returns on investment, continuing a cycle of reducing research resources. The sell-side analyst role now has three principal components: 1) to provide access to company managements in their existing coverage universe; 2) to provide coverage for companies that are underwriting clients; and, 3) to provide “hot data points” – particularly for handicapping quarterly results. Buy-siders compete for management access and seek to combine these data points with their own findings to feed trading decisions.

Unfortunately, individual data points legally obtained and disseminated rarely move the needle in providing an adequate sample size on which to base an investment decision, let alone a trading decision. For buy-siders, even aggregating data points from numerous analysts covering a particular sector or company does not provide a relevant statistical sample.

Limitations of today’s analytics

For example, let’s say a mid-sized publicly-traded technology company goes to market with a blend of 100 direct sales teams (one salesperson and one systems engineer per team) and 500 channel partners (mixed 75%/25% between resellers and systems integrators). Further, assume that these teams and partners are dispersed in proportion to the company’s 65%/35% sales mix between North America and international. How many salespeople and channel partners would an analyst have to survey to get an accurate picture of the company’s business in any given quarter?
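
As a rough back-of-the-envelope sketch, the standard sample-size formula for estimating a proportion, with a finite-population correction, shows how quickly the numbers become unmanageable. The 95% confidence level and ±5% margin of error below are illustrative assumptions, not figures from any actual survey:

```python
import math

def required_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Respondents needed to estimate a proportion at the given margin of error,
    applying a finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate (~385)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical figures from the example above
direct_field_staff = 100 * 2   # 100 teams x (1 salesperson + 1 systems engineer)
channel_partners = 500         # ~375 resellers, ~125 systems integrators

for label, pop in [("direct field staff", direct_field_staff),
                   ("channel partners", channel_partners)]:
    print(f"{label}: survey roughly {required_sample_size(pop)} of {pop} each quarter")
```

Even under these generous assumptions, that works out to roughly two-thirds of the direct field organization and more than 200 channel partners to reach every single quarter, for just one covered company.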

If a typical sell-side analyst covers 15-20 companies (quintuple that for buy-side analysts), the multiplier effect of data points that an analyst would have to touch makes it humanly impossible to gather sufficient information. Moreover, with 50% of most tech company deals closing in the final month of a quarter, of which half often close in the final two weeks of that month, how much visibility can an analyst have?

Further, why would a company’s sales team talk to anyone from the investment community in the final weeks of a quarter when the only people they are interested in speaking with are customers who can sign a deal? Now consider that many companies throughout the supply chain have instituted strict policies in response to recent scandals to prevent any employee from having any contact with anyone from the investment community.

Even the best-resourced analysts lack the tools to correlate the data points they do gather to identify meaningful patterns for either an individual company or an entire sector. Finally, with shorter-term investing horizons and high-frequency trading dominating volume, how relevant are these data points anyway?

The big data approach to research

Stocks generally tend to trade on either sector momentum or overall market momentum. Macro news or events are far more likely to impact a sector’s movement, and therefore the movement of any stock in that sector. This includes volatility around quarterly earnings – which can run 10%-30% for technology stocks – because “beats” or “misses” are frequently impacted by macro factors. Excuses such as “sales execution” or “product transition” or “merger integration” issues are less frequent than conference calls would suggest. “Customers postponed purchases” or “down-sized deals” or “customers released budgets” or “a few large deals closed unexpectedly” are more likely explanations.

Now, major sell-side and buy-side institutions are trialing new software that leverages cloud infrastructure and big data analytics to model markets and stocks. Massive data sets can include macro news from anywhere in the world, such as economic variables, political events, and seasonal and cyclical factors. These can be blended with company-specific events, including earnings, financings or M&A activity. Newer data sources, including social media, GPS and spatial data, can also be layered into the models. Users can input thousands of variables to build specific models for an entire market or an individual security.
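
As a purely illustrative sketch (the platforms themselves are proprietary; the feature names and synthetic data below are invented for the example), a model that blends macro variables, company-specific events and newer signals might be prototyped along these lines:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the kind of blended data set described above:
# macro variables, company-specific events, and newer signals such as social media.
rng = np.random.default_rng(0)
n = 1_000
X = pd.DataFrame({
    "gdp_growth": rng.normal(2.0, 0.5, n),          # macro variable
    "sector_momentum": rng.normal(0.0, 1.0, n),     # sector-level price momentum
    "earnings_surprise": rng.normal(0.0, 0.05, n),  # company-specific event
    "social_sentiment": rng.normal(0.0, 1.0, n),    # newer data source (e.g., social media)
})
# Toy target: next-quarter excess return, deliberately driven mostly by macro/sector factors
y = (0.6 * X["sector_momentum"] + 0.3 * X["gdp_growth"]
     + 0.5 * X["earnings_surprise"] + rng.normal(0.0, 0.5, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("out-of-sample R^2:", round(model.score(X_test, y_test), 3))
print(dict(zip(X.columns, model.feature_importances_.round(3))))
```

The feature importances give a first read on which of the blended inputs the model actually leans on, which is the same question an analyst would otherwise try to answer by hand.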

As with any predictive analytics model, the key is to ask the right questions. However, the machine learning capabilities of the software allow the system not only to answer queries but also to determine what questions to ask.

The advantages to both sell-side and buy-side firms are significant. They include:

  • Lower costs. Firms can avoid major technology investments by leveraging the scale and processing power of cloud-based infrastructure and analytics software. They can collect, correlate and analyze huge, complex data sets and build models in a fraction of the time and at a fraction of the cost of doing the same work with in-house analysts.
  • Accuracy. Machine learning and advanced predictive analytics techniques are far more reliable and scalable than models built in Excel spreadsheets. Patterns can be detected to capture small nuances in markets and/or between securities that high-frequency trading platforms have been exploiting for years.
  • Competitiveness. The software can make both sell-side and buy-side firms more competitive with the largest, most technologically advanced hedge funds that have custom-built platforms to perform analytics on this scale in real time. In addition to enhancing performance, the software can be leveraged to improve client services by making select tools available to individual investors.

Analysts become data scientists

The analyst skill set must evolve. Analysts will still have to perform fundamental analysis to understand the markets they follow and each company’s management, strategy, products/services and distribution channels. And they will still have to judge whether a company can execute on these factors.

But to increase their value, analysts will have to do statistical modeling and use analytics tools to gain a deeper understanding of which drivers move markets, sectors or particular stocks. Data discovery and visualization tools will replace spreadsheets for identifying dependencies, patterns and trends, for valuation analysis, and for investment decision making. Analysts will also need a deeper understanding of client strategies and trading styles in order to tailor their “research” to individual clients.
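
To make the data discovery point concrete, here is a minimal sketch of the kind of dependency check that might replace a spreadsheet. The return series are synthetic placeholders; a real analysis would pull them from a market data feed:

```python
import numpy as np
import pandas as pd

# Placeholder daily returns for a sector index and a single stock in that sector.
rng = np.random.default_rng(1)
dates = pd.bdate_range("2013-01-02", periods=252)
sector = pd.Series(rng.normal(0.0, 0.010, len(dates)), index=dates, name="sector")
stock = 0.8 * sector + pd.Series(rng.normal(0.0, 0.008, len(dates)), index=dates)

# Rolling 60-day correlation: a quick read on how much the stock trades on sector momentum.
rolling_corr = stock.rolling(60).corr(sector)
print(rolling_corr.dropna().describe().round(2))
# rolling_corr.plot()  # a visualization tool would show regime shifts at a glance
```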

These technologies may well continue to shrink the ranks of analysts because of their inherent advantages. But those analysts who can master these techniques to complement their traditional roles may not only survive, but lift their value – at least until the playing field levels – because of their new alpha-generating capabilities.

Market Research Axioms – If You Remember Anything Remember This

Posted on November 7, 2018 in Uncategorized

The value of strong questionnaire design, complemented by the task of high-quality sample development, is not fully appreciated. Often these two essential building blocks of market research are relegated to the back of the line on research projects.

Research Axiom One: You can never fully recover from a poorly written questionnaire.

o No manipulation of the variables, regardless of how cleverly done
o No amount of analysis, regardless of how brilliant
o No degree of insightful interpretation, regardless of intellectual prowess
Nothing can save you from a poor research foundation. The building will collapse like a house of cards!

If there is one part of the research process that I know, it is questionnaire design. It is a task repeatedly given insufficient time and attention. Clients and research professionals alike often underestimate the time it will take to develop a truly well-structured and concise instrument.

What amazes me most? Project leaders relegate this task to a status summed up by the attitude: “Once the questionnaire is done we can get on with the important stuff, like analysis and reporting.” The assumption that the analysis work is the essence of the research, and the expectation that interpreting the results is where the mastery of research ultimately lies, are a mystery to me.

Have we not pounded the concept of garbage-in, garbage-out into our heads? Can new internet tools substitute for critical thinking and the hard work of aligning the research instrument with the purpose of the study, so that it answers the business questions the sponsors paid to have answered?

If this seems like a bit of a rant, well, I guess I am guilty. My own research-on-research, including the use of a 25-point questionnaire audit system, has shown me that even well-heeled researchers are less diligent about quality than one would hope. Research is not only science, it is a craft [perhaps an art], and if the proper fundamentals are not applied the product is less than artful.

I will end this part of my ranting with an analogy [but don’t be surprised to hear more on this topic]. If you have not studied and then practiced writing poetry, would you expect to publish a book of poems simply because your marketing department asked you to? Designing a good quality research instrument probably takes less talent than being a good poet, but it’s close.

Wait, not so fast; we are not done. There is another mistake from which you cannot fully recover. A poor questionnaire design is one possible fatal mistake, but not the only one. Good, solid sample development is also necessary. Here is another Research Axiom worth your consideration.

Research Axiom Two: You can never fully recover from a sample that lacks validity; and once again:

o No manipulation of the variables, regardless of how cleverly done
o No amount of analysis, regardless of how brilliant
o No degree of insightful interpretation, regardless of intellectual prowess
Nothing can save you from a poorly developed sample!

The value of sample development is also underappreciated, as are the skills related to creating a valid sample. Project managers, research analysts, and all those who lose sleep over the quality of the sample sources they have available, and who work hard to provide the best possible sample for each research project they conduct, are worth their weight in gold.

With numerous challenges to good sample development always hovering over us, if the research team conducting the study does not pay close attention to this critically important task, the chances of deriving useful results diminish rapidly. One of the worst situations to be in is standing in front of a room full of executives, presenting the research implications, when from off in the far corner an executive vice president (EVP) asks you, “Are you sure about that finding? Who were these respondents? They don’t appear to have any knowledge about the market or our products.”

If you can definitively reply, “We believe the respondents in this sample are qualified,” and then give a crisp response about the quality control (QC) steps used to verify the validity of the sample, you have saved the day. If, on the other hand, you hesitate and cannot defend the validity of the sample, you have lost your audience – there is nothing more they want to hear from you, because in their minds the voice of the respondents does not reflect the people they are trying to reach – and the day ends badly.
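
What those QC steps look like will vary by study, but as a purely illustrative sketch (the column names and thresholds below are assumptions made for the example, not an industry standard), basic respondent-level checks can be as simple as:

```python
import pandas as pd

# Illustrative respondent-level quality control on a completed survey file.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "minutes_to_complete": [12.5, 3.1, 14.0, 11.2],
    "screener_uses_product": [True, True, False, True],
    "grid_answer_variance": [1.4, 0.0, 1.1, 0.9],   # 0.0 suggests straight-lining
})

flags = pd.DataFrame({
    "speeder": responses["minutes_to_complete"] < 5,          # completed implausibly fast
    "failed_screener": ~responses["screener_uses_product"],   # not actually qualified
    "straight_liner": responses["grid_answer_variance"] == 0,
})
responses["valid"] = ~flags.any(axis=1)
print(responses[["respondent_id", "valid"]])
```

Being able to walk an EVP through checks like these, and through how the sample frame was built in the first place, is what turns “Are you sure about that finding?” into a short conversation.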

If you do not care about the quality of the research you conduct, well, shame on you; but at least recognize that a sample of good quality is a necessity for self-preservation – enough said.
