From Forbes.com | How To Overcome Misconceptions About Hiring People Smarter Than Yourself

It has become a common refrain in business that good leaders hire people better than themselves. Weak leaders, by contrast, surround themselves with "yes people": candidates incapable of challenging them, not too hot, not too cold, but "just right" for the position, as Goldilocks would say. So how do you avoid this trap and hire people with the potential to do your job even better than you one day?

Like marathon runners, leaders have to train their minds to overcome the default position, which in this case is protecting their own job. That default is likely rooted in millions of years of evolution, and being aware of it is the first step to challenging it. Humans, it would seem, have been programmed by evolution to resist challengers and to build "moats" around their means of survival (their jobs, in this case, but historically also their castles, farms and nations). In a capitalist, merit-based society, however, self-preservation consists of a smaller part "defense" and a much larger part "offense": in my opinion, it is always more important to out-compete and out-innovate competitors than to hunker down and prevent attacks on one's own position.

As an entrepreneur and head of growth, I believe the first step toward becoming a leader who instinctively seeks out people smarter than themselves is understanding that it is more important to build a machine that expands the pie for everyone and out-innovates competitors than to defend your own seat. What applies at the corporate and societal level also applies to the individuals on your team. Part of that involves building up talent from within, not just recruiting from outside the organization.

This need is even more pronounced in the Bay Area.

The Bay Area GDP grew by 5.2% in 2016, according to a report by the Center for Continuing Study of the California Economy, a rate three times the national growth rate. That would explain why we see so many cranes on the horizon. Although it matters everywhere in the world, building up talent from within your organization is even more critical in fast-growing, talent-hungry markets like the Bay Area.

The hiring decisions we make are crucial to developing internal talent. 

The building block of developing talent from within is hiring for a candidate's potential, not just for the current position. If a manager looks for someone who is "just right" and who won't challenge the manager's own position, then I believe they risk creating a stagnant organization without the talent to keep it competitive. If, on the contrary, managers learn to hire for potential and develop leaders within their organizations, then I believe they will see faster growth, better morale, fewer recruitment headaches and, best of all, the satisfaction of mentorship.

The numbers seem to speak for themselves when it comes to internal promotion. According to a 2015 Harvard Business Review article, a Wharton School of Business study used data on over 11,000 internal Fortune 100 hires to determine that those internal hires performed better across the board, having "higher competency and contribution ratings (two different measures of performance) during their first year on the job" and a greater chance of being "considered top performers (rated in the Top 25% of the performance distribution of peers in similar jobs)."

If the numbers so clearly favor this approach to developing people, then why do so many find it so hard? Put another way: if we all know it's important, why don't we do it?

One misconception prevents managers from hiring the best. 

Despite data like this and the folklore surrounding it, the reality is that managers are still afraid. Speaking with many first-time managers, I've come to realize that the single biggest barrier to implementing this hiring framework is a misconception about what it means.

Most new managers think that "hiring people smarter than you" means hiring people more senior than you. If you are, say, a vice president, and you feel that achieving this ideal means hiring people who are also vice presidents in their current roles, you will understandably feel threatened! Such horizontal hiring is more often done to replace someone than to build out a team, and it is a recipe for disaster. Yet when a new manager sees an application from someone whose title equals their own, they assume that a true leader would interview that person.

The opposite is true — this person is looking for your job now, and that is hopefully not the job you’re hiring for.

The truth about hiring people smarter than you is that they could grow into taking your job and, hopefully, doing it better than you. They may have more intellectual horsepower, a better education or simply the drive that you know will lead them to figure it out. It's this critical distinction, between what they are today and what their potential is, that makes it much easier to understand how to hire with this mindset.

Some of the most inspiring stories in the business world are about ordinary people accomplishing extraordinary things, like the early Microsoft employees who, The New York Times reported (paywall), became millionaires. In my own life, my grandfather started as an employee at a company in Western Canada. He went on to become president, then majority owner, of that company, and he grew it into the largest regional company in its industry. In my experience, these stories of extreme ownership and incredible value creation, which make up the fabric of our economy, prosperity and folklore, almost always involve leaders who hire for potential, not position. You can become one of those leaders.

Why American Healthcare is the Most Expensive in the World

THIS BLOG IS IN DRAFT

"There is a simple reason health care in the United States costs more than it does anywhere else: The prices are higher."

“The United States spends more on health care than any of the other OECD countries spend, without providing more services than the other countries do,” they concluded. “This suggests that the difference in spending is mostly attributable to higher prices of goods and services.”

The result is that, unlike in other countries, sellers of health-care services in America have considerable power to set prices, and so they set them quite high. Two of the five most profitable industries in the United States — the pharmaceuticals industry and the medical device industry — sell health care. With margins of almost 20 percent, they beat out even the financial sector for sheer profitability.

The players sitting across the table from them — the health insurers — are not so profitable. In 2009, their profit margins were a mere 2.2 percent. That’s a signal that the sellers have the upper hand over the buyers.

This is a good deal for residents of other countries, as our high spending makes medical innovations more profitable. “We end up with the benefits of your investment,” Sackville says. “You’re subsidizing the rest of the world by doing the front-end research.”

But many researchers are skeptical that this is an effective way to fund medical innovation. “We pay twice as much for brand-name drugs as most other industrialized countries,” Anderson says. “But the drug companies spend only 12 percent of their revenues on innovation. So yes, some of that money goes to innovation, but only 12 percent of it.”

And others point out that you also need to account for the innovations and investments that our spending on health care is squeezing out. “There are opportunity costs,” says Reinhardt, an economist at Princeton. “The money we spend on health care is money we don’t spend educating our children, or investing in infrastructure, scientific research and defense spending. So if what this means is we ultimately have overmedicalized, poorly educated Americans competing with China, that’s not a very good investment.”

Philip Bonner:

"75% of the world’s medical R&D is done in or for the U.S. healthcare system, and that is why.  To put it another way, our system pays for *three times* the medical advancement as the *rest of the world put together*.  Canada once had a thriving medical R&D industry; when they socialized their system that industry largely shut down and moved one country south.  But now, there is nowhere else left for that all to move to.  If we shut it off here, it’s gone.

A very large part of the reason our healthcare system is so expensive compared to others that seem to have equal or better outcomes is exactly this problem – that we bear the burden of paying for the world’s medical advancement, and that other systems take advantage of that without paying their fair share of it.  But that leaves us with only three options – (a) keep paying for it, in which case our system will continue to be more expensive than anyone else’s, (b) socialize our system and stop paying for it, which will result in *no one* paying for it, and thus then it not getting done, or (c) make the other systems pony up and pay their fair share, which would be the best answer.

The majority of the money for the world’s medical advancement is recovered from the American market.  Most drugs cost more to develop than to make.  If you have a drug that costs $5 a dose to make, but cost $100 a dose to get to market (in the U.S., amortized under our current shortened patent timer), then it will be $105 a dose.  If some other country then decides that they will pay $25 a dose for it, then the company can sell it there, make back $20 a dose on the extra doses, and then the price here can go down to $85 a dose.  So generally, the companies do go ahead and do that.  But that still leaves us paying more than three times as much, and funding most of that development cost.  Whereas if that other country had to pay a fair market price, the cost could be evenly divided, and it would be $55 a dose in both places.  That other country would still have the option to have a socialized system, and to then provide that medicine to its own citizens for any price it chooses, or for free, but wouldn’t be able to do so at our expense by dumping most of the development cost burden on us in the process.

Solving this would be tough, but most of the other wealthy countries with price controlled socialized systems are members of the WTO.  Maybe a suit through the WTO for restraint-of-trade could be filed that would prevent the governments of those countries from dictating artificially low prices.  Again, those other countries would still have the option to have socialized systems, and to then provide that medicine to their own citizens for any price they choose, or for free, but wouldn’t be able to do so at our expense by dumping most of the development cost burden on us in the process.
=====
A more detailed explanation of drug pricing:

I’m going to create a “test case”, using some actually pretty realistic numbers.  The main concept is that it very often costs FAR more to develop a drug than it does to make it once it has been developed.

Let’s suppose, to make the math easy, that we have a drug developed that cures a condition with one dose.  Once on the market, this drug costs about $25 a dose to make.  Getting this drug approved in the U.S. costs $100,000,000 and consumes 12 of the 17 years of the patent duration (typical amounts).  Getting the drug approved in Europe, which has similar (but not entirely compatible) standards would be about the same; $100,000,000 and 12 years.  Getting the drug approved in both U.S. and Europe would save some money through avoiding duplication, and could be done in the same amount of time, for a total of $125,000,000.  The investors that put up the money for this development expect to double their money (not unreasonable for a 12-17 year investment).  The drug will sell in the U.S. market for what they need to sell it for.  The E.U. declares they will pay $125 a dose.  There are 100,000 patients a year in the U.S. with this condition, and 150,000 a year in the E.U.

For the sake of simplicity, I’m going to ignore advertising costs and such for this example.

So here are the options for the company:

Seek approval only in Europe.  150,000 patients a year times five years of patent = 750,000 patients.  $125 a dose, minus $25 a dose to produce, = $100 a dose return; times 750,000 patients = $75,000,000.  Not economically viable.  To return the costs, $200,000,000 (development + investor profit) divided by 750,000 patients = $267 a dose, plus production costs = $292 a dose.

Seek approval only in U.S.  100,000 patients a year times five years of patent = 500,000 patients.  $200,000,000 (development + investor profit) divided by 500,000 patients = $400 a dose, plus production costs = $425 a dose.  Economically viable if the condition is serious enough that this price is worth it.

Seek approval in both markets.  $250,000,000 (development + investor profit), minus the $75,000,000 return from the price controlled E.U. market, leaves $175,000,000 to recover in the U.S.  $175,000,000 (development + investor profit) divided by 500,000 patients = $350 a dose, plus production costs = $375 a dose.  So this helps the price in the U.S. some, and is worth doing, even though it still costs three times in the U.S. what Europe has decreed it will pay.  This is the scenario that is most common.

If both markets were free, then the price would be: 250,000 patients a year times five years of patent = 1,250,000 patients.  $250,000,000 (development + investor profit) divided by 1,250,000 patients = $200 a dose, plus production costs = $225 a dose.  Quite a bit less for us; not much more for them if they pick up their fair share for the five years left on the patent.

After that, a generic gets sold to everyone for $50 a dose, and everyone benefits (but only if it was economically viable to develop it in the first place).

If we talk about patent duration reform, that transforms all these numbers a lot, and is a whole new set of scenarios.  But I hope this helps you understand how the U.S. market winds up subsidizing the socialized medicine systems of other countries, and thus why medicines cost more here."
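
The arithmetic in the quoted test case is easy to reproduce. Below is a minimal Python sketch of the same amortization model; every input is the commenter's illustrative assumption (doubled development cost as investor return, five years of effective patent life, the stated patient counts and prices), not real drug-development data:

    # Reproduces the drug-pricing "test case" quoted above.
    # Every input is the commenter's illustrative assumption, not real data.

    COST_TO_MAKE = 25        # production cost per dose, $
    MARKET_YEARS = 5         # 17-year patent minus 12 years spent on approval
    US_PATIENTS = 100_000    # patients per year, U.S.
    EU_PATIENTS = 150_000    # patients per year, E.U.
    EU_PRICE = 125           # price the E.U. declares it will pay, $

    def breakeven_price(dev_cost, doses, recovered_elsewhere=0):
        """Per-dose price returning doubled development cost (the assumed
        investor profit), net of revenue recovered in other markets."""
        return (2 * dev_cost - recovered_elsewhere) / doses + COST_TO_MAKE

    us_doses = US_PATIENTS * MARKET_YEARS             # 500,000 doses on patent
    eu_doses = EU_PATIENTS * MARKET_YEARS             # 750,000 doses on patent
    eu_margin = (EU_PRICE - COST_TO_MAKE) * eu_doses  # $75M at the capped price

    print(round(breakeven_price(100e6, eu_doses)))             # E.U. only: needs 292, not viable at 125
    print(round(breakeven_price(100e6, us_doses)))             # U.S. only: 425
    print(round(breakeven_price(125e6, us_doses, eu_margin)))  # both markets: 375 in the U.S.
    print(round(breakeven_price(125e6, us_doses + eu_doses)))  # both markets, free pricing: 225

The outputs match the quoted scenarios ($292, $425, $375 and $225 a dose): the fixed development cost dominates the per-dose price, so every price-controlled market shifts the amortization burden onto the uncontrolled one.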

 

Other:

"I would only point out that other countries piggyback — to use your apt word — not only on U.S. defense spending, but on U.S. healthcare spending too. Our drug costs are far higher than in other parts of the world. Why? In part because the U.S. government does not bargain with pharmaceutical companies to the extent that other nations do. Given the sheer heft of government purchases, via Medicare and Medicaid, we have what’s called “monopsony” power. Practically speaking, this means the U.S. government can force drug companies to lower their price. But we do not. By stark contrast, other countries do.

Since we don’t, this means that, practically speaking, we Americans subsidize the development of drugs that other countries can buy more cheaply for their citizens, since in almost all other countries, health care is national and is bought in volume by their governments.

The conservative jurist Richard Posner argued in 2009 that the government should use its monopsony power to muscle Big Pharma into lowering its prices. But he acknowledged that “[t]he drug companies in turn would reduce their output.”

I, meanwhile, have long taken the position, perverse in the eyes of most people I know, that — to put it bluntly — the more drugs, the better. (Legal drugs, I mean.) In other words, I do not want drug companies to decrease their output."

----

Ken Thorpe:

"Patients throughout the globe benefit from the pharmaceuticals, medical devices and innovative treatment approaches developed by U.S.-based manufacturers and healthcare providers, but pay less for them because their countries make liberal use of government price controls.

Is it fair that other nations don’t pay their share for the drugs, devices and treatments that yield longer and healthier lives? Of course it’s not. Should this be a priority in trade negotiations between our government and their counterparts in other capitals, ensuring that prices are set by the market and not by arbitrary price ceilings? Certainly. Spreading the financial responsibility for innovation among all those who benefit from it would lead to more equitable pricing worldwide and greater benefit ultimately.

Until that occurs, though, it’s foolhardy to point a finger at just one healthcare sector, and blame it for the high price of modern medicine. The fact is that pharmaceuticals represent around 10 percent of the nation’s total healthcare spending – and have been for more than 40 years. The drugs administered for a host of diseases, from cancer to diabetes to heart disease, many times actually save money by preventing more expensive acute procedures. Given that the primary outcomes of medical innovation are longer life, less disability, and better quality of life, it perplexes me that these benefits are almost always ignored in the analysis of prices of any number of medical goods and services." http://thehill.com/blogs/congress-blog/healthcare/220330-the-world-is-stiffing-the-us-on-healthcare

http://fortune.com/2015/11/03/us-europe-healthcare-gdp/

https://www.cato.org/publications/policy-analysis/bending-productivity-curve-why-america-leads-world-medical-innovation

https://www.forbes.com/sites/realspin/2017/12/15/a-commonsense-compromise-on-health-care-free-market-coverage-for-everyone/#756ad7e15da2

https://www.heritage.org/health-care-reform/report/compelling-evidence-makes-the-case-market-driven-health-care-system

https://www.washingtonpost.com/blogs/ezra-klein/post/why-an-mri-costs-1080-in-america-and-280-in-france/2011/08/25/gIQAVHztoR_blog.html?noredirect=on&utm_term=.4004a804ef59

https://www.bloomberg.com/view/articles/2017-07-06/the-world-doesn-t-mooch-off-u-s-health-care-research

https://www.forbes.com/sites/peterubel/2014/04/18/the-real-health-care-subsidy-problem/#1129bb732881

https://www.ibtimes.com/how-us-subsidizes-cheap-drugs-europe-2112662

https://arcdigital.media/u-s-health-care-reality-check-1-pharmaceutical-innovation-574241fb80ba

Is Native Advertising Dead?

As Vice President of Sales at MyFinance, the leading native advertising platform for financial services, I came to know the native space well.

 

Na-tive Ad-ver-tis-ing: "material in an online publication which resembles the publication's editorial content but is paid for by an advertiser and intended to promote the advertiser's product." Ex: "native advertising is blurring the lines between advertising and content"

New Definition: "Native advertising is a form of paid media where the ad experience follows the natural form and function of the user experience in which it is placed."

Difference? On the surface, not much. Native ads fit the form and function of the page around them, which makes them more effective. Facebook ads are native ads. Ads in Google search results are, arguably, native ads. And in-line content ads on major publishers' sites, which, but for the small "Sponsored" label, look exactly like another article, are definitely native ads.

The new definition means that native advertising now encompasses an increasingly large share of all ad spend. In fact, ads that follow the form and function of the user experience - for instance, the experience of seeking out interesting content while scrolling through one's social feed - now represent 60% of all ad spend. So much for native being a niche. So why are so many still hawking banner ads and making them sound sophisticated by calling them programmatic? And why are so many large companies putting budget into them?

On the front lines of growth, and therefore of agency and brand relationships, at MyFinance, then the fastest-growing native ad-tech company, I grew accustomed to widespread confusion about the definition of native: most major publishers still think native means a sponsored-content article, or something requiring the use of their "studio" (starting at just $50,000!). Definitions are important in business because they remove the need for unnecessary pause and explanation.

Why most major publishers have no clue is beyond the scope of this blog, so let's focus on those who do think it's important and know what it is: more than 80% of them think native is too risky. Just last week, a major publication condemned native as "still on trial" because integrated ads leave consumers "not impressed". More thorough research, however, indicates that consumers enjoy native units far more than distracting formats like banners. Indeed, the single fastest-growing ad type last year, up 46% year over year, is a form the IAB has termed "Native-Style Display".

The "old" definition of native excludes this engaging ad format, describing only sponsored editorials. Sponsored editorials are a branding exercise in a world where the majority of advertising dollars go toward direct response, and where the majority of those dollars go to Facebook and Google. Native-style display matches the form and function of the user's experience and, ideally, is intelligent enough to push users further down the funnel or path they opened their computer to accomplish in the first place. That's what we built at MyFinance, and that's what consumers seek out in ads; it's what they respond to.

The native "boom" is long overdue. With the rise of mobile, the effectiveness of display ads on much smaller screens plummeted (any ad large enough to be seen must consume nearly the entire screen, becoming even more distracting and upsetting). Native, or in-feed, advertising became what it is today in response to this (the CTR on native ads on mobile is nearly 700% of what it is on desktop).

Native ad buys, however, still occur far less frequently through programmatic channels. Unless, of course, one challenges the definition of programmatic as well, expanding it to mean campaigns where data determines whether an ad serves and where optimizations occur at scale to meet client KPIs. In that case, every native platform that performs well is programmatic too. But in the traditional sense, programmatic loses when native gains. And native is gaining. Fast.

So, to answer the title: no, native advertising is not dead. But if "native" still means only the sponsored editorial of the old definition, the answer is much closer to yes.

Politics - Antitrust law is anti-'trust in the market'

Antitrust laws are among the most onerous and costly to the economy, yet they are widely supported largely because they are not well understood. A vast majority of Americans consider monopolies both possible and undesirable, and government edicts sound like a reasonable solution to many.

The problem is, this is completely false.  

Competitors, by their nature, want "in", and monopolists, by their nature, want to restrict new entrants. There are only two ways to restrict entrants in the long run: being better than them, and having government impose restrictions on entry to the market (as in healthcare, where doctors' groups here in Texas have long, and successfully, limited the number of new doctors trained each year, producing a severe shortage but also inflated doctors' wages - and I say this objectively, as my better half is one of them).

So if we take government (licensing, regulations, capital requirements, outright bans on new entrants) out of the equation, monopolists are left with only one tool with which to fight off would-be competitors: being better than them.

So what does "better" mean, and aren't there other ways to restrict competition, such as outsized market power or control of distribution? The short answer is no, not in the long run. Google is a great example of a company that is simply better than everyone else at what it does, and it has a monopolistic level of market share as a result. But that isn't a bad thing: everyone benefits from Google's fight to remain the best, as anyone who uses it well knows - that is precisely why they use it.

Perhaps the best example of the pointlessness of antitrust is an international one: OPEC. OPEC, as you probably know, is the Organization of Petroleum Exporting Countries, a group of oil-rich nations, mostly in the Middle East and North Africa, that in most cases rely heavily on oil revenues for government coffers and economic stability. OPEC has controlled prices for more than 30 years by meeting periodically to determine how much to produce; oil being a commodity, and OPEC controlling a large enough share of it, the cartel moved prices by reducing supply (and, in many cases, simply by signaling to the market that it MIGHT reduce supply).

Consider the market power of that (temporary) monopoly when, in 1973, OPEC's Arab members embargoed oil shipments to the U.S. in response to U.S. support for Israel, triggering a domestic oil supply crisis. Think lines at the gas station stretching for miles...

Government, however, did not break up OPEC's monopoly with some bureaucratic antitrust ruling in an international court. Competitors, by their nature, wanted "in", and the monopolist was trying to keep them out; to get in, new entrants had to be fundamentally better. And along came U.S. shale, or "fracking".

  • "Horizontal Drilling is the real marvel of engineering and scientific innovation.  While impressive in its own right, the main innovations in "Fracking" in recent years have been beefing up the generating horsepower to accommodate horizontal wells rather than vertical ones, and refining of the fluids used to conserve water and create better, longer lasting fractures in the target formation."
  • "...the real marvel is the innovation that has taken place in the realm of Horizontal Drilling.  Think about what this advancement has meant just in terms of access to the resources:  When drilling into a hydrocarbon bearing formation 100 feet thick, vertical drilling would allow an operator to contact 100 feet of rock, which would limit your potential recovery to whatever oil or gas would flow into that length of pipe.
  • Horizontal Drilling now allows these same operators to drill and set pipe for a mile or more horizontally through this same rock formation.  You are now contacting and "Fracking" 5,280 feet of rock rather than 100 feet, which multiplies expected well recovery rates many times over.  The technology employed is so advanced and exacting that drillers today can hit a target at the end of a drill string that is 10,000 feet vertical with a mile long horizontal section that is no more than a few inches in diameter.  Drillers also use sensors to detect particularly promising rock intervals within the formation, and are able to move the drill string up or down, left or right as they drill through the horizontal section to target those intervals.
  • These extraordinary technological achievements enable operators to maximize returns from each well, which in turn means higher royalty payments to mineral owners, and higher tax revenues for local and state taxing authorities.
  • Advanced horizontal drilling technology also produces positive results for the environment.  A single horizontal well can replace the need to drill a dozen or even more vertical wells to access a similar level of resource.  For the environment this means far less air emissions, far less water usage and disposal needs, and far less land impacted to produce a similar amount of oil and natural gas.
  • Add to all of that the fact that the industry's ability to access natural gas in shale formations, and the supply abundance that has produced, has enabled the conversion of dozens of older coal-fired power plants to cleaner-burning natural gas.  That has led directly to the lowering of US greenhouse gas emissions to levels not seen since the early '90s, a result not matched by any other industrialized nation."

Fracking technology was a market response to a "monopoly", and in effect it eliminated that monopoly.
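
The contact-length numbers quoted above are easy to sanity-check. A minimal sketch (the 100-foot formation and one-mile lateral are the commenter's illustrative figures, not field data):

    # Sanity check on the horizontal-drilling contact figures quoted above.
    # Inputs are the commenter's illustrative figures, not field data.

    MILE_FT = 5_280
    vertical_contact_ft = 100        # rock contacted by a vertical well
    horizontal_contact_ft = MILE_FT  # one-mile lateral through the same rock

    multiple = horizontal_contact_ft / vertical_contact_ft
    print(f"Rock contact: {multiple:.0f}x a vertical well")  # ~53x

With roughly 53 times the rock contact per well, the quoted claim that one horizontal well can replace "a dozen or even more" vertical wells reads as conservative.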

Here are some recent headlines:

Crude Oil Prices Extend Drop as OPEC Struggles with Reducing Glut

It's the final showdown: OPEC struggles as the US ramps up oil ..

Why OPEC is struggling for oil supremacy - The Australian

3 Tips for Learning From Your Business Failures

By Cameron Jacox

Published October 25, 2012 in Entrepreneur Magazine

"My startup was born out of multiple failures.

As part of my first-year courses in college, 29 of my classmates and I were assigned the task of building our own business. I was elected CEO, and we had to come up with a quality pitch, create a business plan and execute our vision. Our idea was to launch a general store on campus to sell bulk food and drinks to the student community. Market research showed that it would be a hit, given our school’s isolated location. We signed a contract with Pepsi, secured funding and locked down a space in the campus center.

But then, over Christmas break while preparing to launch during second semester, I received a call from one of my professors. He advised me that the Barnes & Noble bookstore on campus had exclusive rights to the sale of “non-meal plan” food and drinks. I had now run into my first legal snag. The negotiations failed, the bookstore refused to budge and we couldn’t operate. I had to figure out what to do next -- my classmates were counting on me.

Though I've since started a different business that offers life insurance policy analysis software called lifeAssist, that experience was formative. Here are the three tips I learned about turning a setback into an advantage down the road:

1. Be ready and willing to pivot quickly. When our plans for the on-campus general store were quashed, we turned around and launched a shop called Threads & Ink, where we sold a range of non-food products. It was up and running in four weeks, and we ended the semester with a solid profit, both financially and experience-wise.

2. Find win-wins. The bookstore’s retail food and drink sales had declined that year. They were in no position to accommodate further cannibalization for the sake of goodwill. On reflection, I should have been more willing to change the fundamentals of our business model. We could have bought food and drinks in bulk through the bookstore and become one of their marketing arms, helping to boost their sales while also making our own profits.

3. Ask more questions. In the end, I found out that one of my professors knew when he approved our business plan that we would run into this problem. He waited and watched. Our professor encouraged learning from experience and was interested in seeing how fast we would be able to turn things around and pick up the pieces. Entrepreneurship is all about taking risks, learning from failure and doing it all over again better."