Fed weighs ‘significant uncertainties’ over global economy

Central bank officials stress rates outlook could shift in either direction

Sam Fleming in Washington


Jay Powell, chair of the Federal Reserve © AP


Federal Reserve officials kept their options open for the next move in interest rates in their latest meeting as they weighed “significant uncertainties” over the US and global economic outlook.

Minutes of the Fed’s March meeting indicated a high degree of uncertainty over the policy outlook, with some officials stressing their views could “shift in either direction” as they seek to determine whether a weak bout of growth will persist.

After raising interest rates four times last year, the US central bank executed a sharp shift in direction in early 2019, shelving prior plans for further rate rises as it moved to a “patient” stance.

President Donald Trump’s willingness to trample over the Fed’s independence with calls for it to cut rates and restart quantitative easing is only adding to the complex outlook policymakers now need to navigate.

In the policy meeting on March 19-20 the Fed held rates in a range of 2.25 to 2.5 per cent, and policymakers trimmed their forecasts for growth this year. Many traders have begun betting that the Fed’s next rates move will be down, and the minutes hinted that this possibility is in the minds of at least some policymakers.

“Several participants noted that their views of the appropriate target range for the federal funds rate could shift in either direction based on incoming data and other developments,” the minutes said.

While most policymakers said rates would probably be kept on hold for the remainder of the year, others still insisted a further upward move was possible.

Treasury and equity prices inched lower following the release of the minutes, as investors responded to the uncertain tone on future interest rate rises; both markets had risen in the build-up to the release.

The benchmark 10-year Treasury yield had fallen 4 basis points to 2.46 per cent ahead of the minutes, and rose 2bp to 2.48 per cent shortly afterwards. The S&P 500 edged back from the day’s trading highs, but remained up 0.2 per cent.

Policymakers at the meeting appeared broadly sanguine about a recent bout of soggy economic data, with most saying they did not expect the weakness to persist beyond the first three months of the year. But they still expected the growth rate to “step down” from the pace set in 2018, amid factors including a diminishing push from fiscal policy.

Household spending was expected to pick up in the coming months, but the Fed singled out “continued softness” in the housing sector as a worry. Rate-setters dwelled heavily on low inflation, which has remained a persistent conundrum even as US unemployment has sunk to multi-decade lows.

In a lengthy discussion, Fed policymakers theorised that consumer expectations were keeping inflation pinned below the Fed’s 2 per cent target, and that the labour market may not be as tight as headline indicators suggested.

Participants in the meeting talked of a “very flat trajectory” for interest rates reflecting factors including low neutral rates — those rates that are deemed to keep the economy on an even keel.

“Some time would be needed to assess whether indications of weak economic growth in the first quarter would persist in subsequent quarters,” the minutes said. “Members also noted that inflationary pressures remained muted and that a number of uncertainties bearing on the US and global economic outlooks still awaited resolution.”

The central bank also discussed plans to end the reduction of its balance sheet, a process that has been under way since 2017, with policymakers targeting a September end-date. The Fed amassed asset holdings that topped $4.5tn in the wake of the crisis before beginning a very gradual process of allowing the securities to roll off its balance sheet. 
Jay Powell, chair of the Fed, last month suggested the central bank would allow its balance sheet to settle at more than $3.5tn later this year — still dwarfing its size before the crisis. In the meeting officials discussed the pros and cons of slowing its balance sheet runoff before reaching the September stopping date.
 
 
Additional reporting by Joe Rennison in New York

Big Data’s Biggest Challenge: How to Avoid Getting Lost in the Weeds

Wharton's Raghuram Iyengar and Evite CEO Victor Cho discuss how firms can optimize their use of data.



Companies have access to more data than ever before. But how can they optimize it without getting lost in the weeds – or losing sight of the customer? Evite CEO Victor Cho and Wharton marketing professor Raghuram Iyengar offered advice from their own experiences during a recent conversation with Knowledge@Wharton. Cho was on campus to host a Datathon with the Wharton Customer Analytics Initiative. Penn students from multiple academic majors were given datasets from Evite and asked to come up with solutions based on the data for improving Evite’s platform and increasing revenue. Evite is among the participants in WCAI’s corporate partner program, which seeks to help companies find ways to better use their data through collaborations with academic researchers, student projects and other initiatives.

An edited transcript of the conversation follows.


Knowledge@Wharton: Victor, you helped engineer a turnaround at Evite after a period of declining user growth and increased competition. How did you use analytics to identify what was going wrong at the company? 

Victor Cho: To give you two seconds of quick history: between around 2008 and 2014, which is when I joined, Evite lost its focus on customers. And that is not something you want to do in a period when things like Facebook became mainstream and the mobile phone became ubiquitous.

Evite at its core is a social service, highly mobile-centric. So analytics drove our turnaround in really two big prongs. The first — and probably most important — core engine that fixed Evite and brought us back into robust growth, which is where we are now again, was focusing on the customer.

This wasn’t hardcore quant analytics, but it was absolutely customer focused. It was understanding what customer usage of the site looked like, what the Net Promoter Score (the metric that we use) was, and where the pain points in the experience were. So all of that, I would argue, is an analytic function applied to customer learning.

The second thing that was critical for us was really building what I call the longitudinal model, by dissecting the customer segments: Who is coming in, where are they coming in from, what is their return rate over time?

We have a model that can project out over two to three years whether the fixes we are putting in place are actually going to translate into downstream growth as opposed to short-term Net Promoter impact. So those are the two big levers that we use.
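As a rough illustration of the kind of longitudinal model Cho describes, here is a minimal Python sketch that projects active hosts by summing the surviving members of monthly sign-up cohorts under two retention curves. The inflow figure and both curves are invented for illustration; this is not Evite’s actual model.

```python
# Minimal cohort-projection sketch. All numbers are hypothetical.

def project_active_hosts(new_hosts_per_month, retention, horizon_months):
    """Project monthly active hosts from a steady inflow of new-host cohorts.

    retention[k] is the share of a cohort still active k months after joining;
    the last value is reused for older cohorts.
    """
    active = []
    for month in range(horizon_months):
        # Sum the survivors of every cohort acquired up to this month.
        total = sum(
            new_hosts_per_month * retention[min(age, len(retention) - 1)]
            for age in range(month + 1)
        )
        active.append(total)
    return active

# Hypothetical retention curves before and after product fixes.
baseline = [1.00, 0.40, 0.30, 0.25, 0.22, 0.20]   # flattens at 20%
improved = [1.00, 0.50, 0.42, 0.38, 0.36, 0.35]   # flattens at 35%

before = project_active_hosts(10_000, baseline, 36)
after = project_active_hosts(10_000, improved, 36)
print(f"Month 36 active hosts: {before[-1]:,.0f} vs {after[-1]:,.0f}")
```

A short-term metric like Net Promoter can move within weeks, but a projection of this shape is what shows whether a retention improvement compounds into growth two to three years out.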

Knowledge@Wharton: I would imagine that you have a lot more data available to you today certainly compared to when Evite launched, and maybe even from a year or two ago. How do you try to wade through that data and identify what is important, what is not important, what should be focused on, and what shouldn’t?

Cho: It’s funny, we have a ton of data available, and one of the things that I have to do perpetually is to tell people — this sounds weird, but not to look at the data. They are going way too granular into datasets when they are actually missing the bigger question.

One great example of that is the first year that I joined, as I mentioned, we used Net Promoter, which is actually at a high level a fairly crude system for fixing experience. I mean, it is nothing more sophisticated than getting a raw stream of input from your customers [saying] what is working, what is not working, and you calibrate that against a 10-point scale. You don’t need statisticians to do this work.
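For readers unfamiliar with the metric, the standard Net Promoter Score arithmetic really is that simple: respondents answer a 0-10 recommendation question, promoters score 9-10, detractors score 0-6, and the score is the percentage of promoters minus the percentage of detractors. A minimal Python sketch, with invented responses:

```python
# Standard NPS calculation; the responses below are invented.

def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6 (7-8 are passives)
    return 100 * (promoters - detractors) / len(scores)

responses = [10, 9, 9, 8, 10, 6, 9, 10, 7, 9]
print(f"NPS: {net_promoter_score(responses):+.0f}")  # 70% - 10% -> +60
```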

And the teams wanted to always go deep. It was like, well let’s go deep in this conversion funnel, and I kept having to say, no, no, no, I don’t care – again, this sounds weird, I don’t care about the conversion on the site because we have 50 bugs and hundreds of customers coming in and saying that the mobile phone [app] doesn’t work. So we’re not going to look at any of that until we fix these things at a high level.

Knowledge@Wharton: So it’s thinking about what data can be used to actually fix customers’ pain points before you go into something really complex?

Cho: Exactly. We literally didn’t start looking at deep conversion funnels until year three — until our Net Promoter got up into the 80 range, which is now world-class, and I felt comfortable that, yes, the core foundational experience is good. Now let’s go optimize and tweak. And then of course, we built all of the deep conversion funnel analysis that you would expect.

Knowledge@Wharton: Because if I am the end user and my app doesn’t work, I don’t care about the funnel.

Cho: Yeah, you don’t care if the button is green or blue, exactly. You had people saying, oh, let’s go change things, let’s go A/B test, let’s go put in this crazy A/B testing functionality, [but we thought] we don’t need to do that either; we just need to get the apps working.

Knowledge@Wharton: Now Raghu, the Wharton Customer Analytics Initiative partners with companies like Evite regularly to look at datasets and examine how they can use analytics to achieve business goals. How does what Victor described about the analytics journey at Evite compare to some of the questions you get from other companies?

Raghuram Iyengar: What Victor said is pretty much on the money. What I have seen companies doing many times is they get so deep into the data they forget the problem itself. While on the one hand there is a lot of richness in looking at data, I think it is very easy to in some sense get lost in the weeds.

You have to step back a little bit and think about what the business problem is, understand what the low-hanging fruit is, and then start digging deeper. Start with a business problem, work backwards, and then understand what analytics is required at what point in time, and structure the analytics in that way.

So I very much agree with what Victor did in terms of trying to understand what the end-user experience is. What are the big pain points? Let’s figure that out first, because until you do, the data itself might not be of very good quality. So let’s clean that up first, and then start looking deeper into optimizing whatever experience might be there.

It is absolutely very much the experience that I have seen over the years, that many companies get so deep into the data that they start forgetting what the big-picture managerial problem is. I am glad to hear that at Evite for example, they started with the problem first, cleaned up the data after that, and then started optimizing. 

Cho: Yes, and I am a huge data junkie, so I also want to make it clear that there is massive power in the more sophisticated data work. I think of it almost like a wedding cake. The very bottom of the cake is — we use Net Promoter, right? Your core engine: Is the customer experience good?
The next layer of the wedding cake is your tactical optimizations: things like conversions, A/B testing, etc. We’re actually past those two layers, and now we’re in the third layer, which is how you actually unlock really innovative business opportunities from data. We just recently hired a VP of data science and business intelligence. We weren’t ready for that role two or three years ago. Now we are; we are building out a data science team and a higher-order team, because we do think at this stage mining our data will take us to the next step of innovation.

Knowledge@Wharton: Tell me a little bit more about the top of that wedding cake. With this new role and trying to integrate that into the rest of your operations, what do you think is next in terms of data challenges for Evite?

Cho: The great thing for us is we actually don’t even know what it is — we have concepts of what is possible, but there is not a singular focus where I am going to tell Jerry — that’s his name — go do x, right? To a large degree, we are going to be in an exploratory mode of, hey, we have this incredibly rich dataset, there are lots of different vectors, let’s go figure out which ones might bear fruit.

That’s why I am super excited about this Datathon. I am actually selfishly hoping we get some interesting tidbits, that [sense of] ‘Wow, we never even thought of that.’ Maybe that becomes a vector of exploration for unlocking business opportunity.

Knowledge@Wharton: That was actually what I was going to ask you next. Tell me a little bit about what led you to do the partnership with Wharton Customer Analytics?

Cho: The history is I got invited to speak at one of the conferences here [in 2018] and got introduced to the organization. And I love what it is doing at a high level. I was a weird undergrad in that I actually constructed a nonexistent concentration at Wharton around statistics, because I felt like that was really the more powerful learning versus getting deep into finance. To me, finance was just an application of statistics.

I told Raghu I was jealous; if I were coming through Wharton now, I would want to go through this curriculum and program because I think it is so powerful. I was enamored of the work they were doing, and I love giving back. There was an opportunity for us, Evite, to partner with Wharton in a way that creates value for both sides. Help students, help us — it’s just a win-win.

Knowledge@Wharton: Raghu, what is the value for Wharton Customer Analytics and for the students to work with these real-world datasets?

Iyengar: We need organizations like Evite, as Victor was describing, that really want to give back to the school in the sense of enriching the next generation of people who will become data analysts and transform the analytics world.

And the way of doing that is learning by doing. That is what we firmly believe at Wharton Customer Analytics: the only way of learning analytics is by doing it. And you do it by partnering with companies that are facing real challenges, so students get exposed to real datasets with real managerial problems.

They work on them, they understand what the different pain points are, they understand what the different levers are, and then they come back with actionable solutions. So working with companies like Evite, that is the only way to move forward in terms of exposing our students to real-world problems.

Knowledge@Wharton: What type of data did the students work with for this event?

Cho: At Evite, we have an interesting site in that we have a long tail of different party types flowing through our system. So it was kind of tricky to work out — you don’t want to dump petabytes of data on these guys. And so the question became, what subset of data are we going to give them?

We ended up with effectively a three-pronged dataset. We have some parties in our system that are what we call seasonal; they happen in bursts. For that, we gave them just one party type: barbecues, which tend to happen in the summer. So the students will get a dataset around barbecue parties.

We have some parties that happen throughout the year; they’re enduring. For those, we gave them a dataset for a party type that I didn’t even know existed: there is this thing called pet parties. People throw parties for their pets. It is actually a fairly large category, so they get to play around with that.

And then we have these spiky, kind of one-time events. For that, we are giving them a Father’s Day dataset. They will have access to that data, and they will also have access to — in some ways what is more interesting — the downstream behaviors of people who were invited to those parties so that they can hopefully help us understand what are the dynamics driving exposure to a party versus downstream behavior.

Knowledge@Wharton: When you say downstream behavior, is that just RSVPs, or is that how they interact with the invitation, or both?

Cho: It is all of their subsequent behavior. So for someone who goes to a pet party who wasn’t a customer and now becomes a customer, what does that curve look like?

We have a pretty simple host-to-guest engine. So you are a host, you throw the party. You become a guest, and then we hope that by being exposed to our wonderful service, at some point when you then go throw a party you will think, oh yeah, Evite, that was a great experience, I want that experience for my guests. So you convert from a guest to a host. So yeah, [the students] will have a dataset that tracks all of those behaviors to understand some of the longitudinal dynamics.
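As a concrete sketch of measuring that guest-to-host loop, the snippet below scans a hypothetical event log, records each user’s first guest and first host dates, and computes the share of guests who later hosted. The log schema and the sample data are invented, not Evite’s.

```python
# Guest-to-host conversion from a hypothetical (user, role, date) event log.
from datetime import date

events = [  # invented sample data
    ("ana", "guest", date(2019, 1, 5)), ("ana", "host", date(2019, 3, 2)),
    ("ben", "guest", date(2019, 1, 5)),
    ("cal", "guest", date(2019, 2, 1)), ("cal", "host", date(2019, 6, 9)),
]

first_guest, first_host = {}, {}
for user, role, day in sorted(events, key=lambda e: e[2]):
    bucket = first_guest if role == "guest" else first_host
    bucket.setdefault(user, day)  # keep only the earliest date per role

# Users who appeared as a guest and later hosted a party.
converted = [u for u, d in first_guest.items()
             if u in first_host and first_host[u] > d]
print(f"guest->host conversion: {len(converted) / len(first_guest):.0%}")  # 2 of 3
```

The same log, bucketed by first-party type (barbecue, pet party, Father’s Day), would give the per-category downstream curves the students are being asked to explore.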

Iyengar: From a student’s perspective, they get to see what the different customers are like. In the Evite case, for example, you might start out by looking at some customers who are guests to a party, kind of going along with what Victor was saying.

At some point they might say, well, I just went to a pet party; I also have a pet; I would like to host a party. So what students see from this is how customers transition through different kinds of dynamics. From a student’s perspective, it is a dataset that is very dynamic, very rich, and very similar to other kinds of datasets that they might see in their own careers.

Secondly, and perhaps more importantly, what they get out of this is not just the analytics per se. It is also how to present back to the company. This afternoon, after these teams have analyzed the data, they will come in and present back to Victor and Jay [Neuman, Evite’s vice president of data science]. They have to be very articulate about what the problem was and how they went about solving it, and translate what they have found into actionable insights.

I think that is a very important piece that many analysts somehow don’t seem to grasp. They are sometimes very good at analyzing the data, but a very critical step, in terms of actionability, is how you convey the results back in a way that a company can actually work with.

Knowledge@Wharton: You have done other Datathons in the past with other companies. What are some of the things that students have come up with?

Iyengar: There are many, many examples. One of the Datathons we did was for Hertz, the rental car company. They had given us a dataset that was quite rich and extensive, very much like the Evite dataset: they had information on, for example, the Net Promoter Score of the people who were serving customers. So this is from the salesperson’s perspective. How did the salesperson do in different locations — for example, on-airport versus off-airport…

And they wanted to get a sense of the dynamics. How are people, for example, changing over time? Do some locations have better service quality than others?

Students, when they first see data of this kind, tend to be overwhelmed a little bit. So they have to understand how to sit back a little bit, understand what the data structure is, understand what the key problem is, and then dive into the data to do the appropriate kind of analysis, as opposed to kind of trying to find the needle in the haystack.

Knowledge@Wharton: Victor, do you have any expectations as far as having fresh eyes on Evite’s data?

Cho: I am expecting a revolutionary innovation. I’m just kidding — I have zero expectations. I just love seeing fresh eyes on data. I think I would be incredibly happy even if we get a hint of a direction of exploration that we haven’t thought of. That would be a crazy success from my perspective.

Iyengar: One of the powers of Wharton Customer Analytics is that we expose a lot of students to a dataset of this kind. And when I say a lot of students, I mean students coming from different backgrounds. These are Wharton students, but also people from engineering and people from economics.

To Victor’s point, I think what is always great when we look at Datathons of this kind is that the business school students from Wharton have one perspective, the engineering school students have another perspective. So, many different perspectives converge to then find an actionable solution.

Cho: We find there are some very smart people who will come in, and they are just missing that fundamental, for lack of a better term, business acumen [for understanding whether] we are solving the right problem at the right level, or whether we should be thinking about this problem differently.

A lot of times, if you have a toolkit, you are just excited to go in and show that you can do stuff. And we’re [thinking], well that’s cute but what are we going to do with this? It is just going to sit there on the shelf as an interesting analysis, as opposed to, oh wow, here is a new product experience that we can actually bring to market because of this insight.

Knowledge@Wharton: After the Datathon, what’s next for the partnership for both of you?

Cho: We are a multi-year sponsor of the program. We actually just had a breakfast this morning where we were talking about all of the different potential ways we might be able to come together in future venues. It will be an ongoing, robust relationship.

Iyengar: There are multiple ways in which we collaborate with companies like Evite.

We have the Analytics Accelerator program, which typically takes place every fall. Let’s say Evite, for example, wants to participate in that: This would be a situation where Evite works with a group of about four to five students over a month-long period, and they deep-dive into the data and come back with actionable solutions. So that is one example.

Datathons, of course, are another example. We also have an Executive Education program, where we are targeting, for example, mid-level executives who want to be data translators.

These are people who are not analysts, but who want to take the results that analysts produce and make them actionable, make them understandable to the C-suite.

We have been talking to Evite to see if they would like to partner with us in those programs as well. There are a host of different things. Another example is the Wharton Analytics Fellows.

These are groups of undergraduate and MBA students in the analytics club. They work with companies as well to solve business problems. There are lots of different exciting opportunities going forward.


Deeper Cracks in China’s Housing Foundations

Too much investment, too few buyers could mean trouble ahead

By Nathaniel Taplin


China’s long housing boom—the main support for growth over the past two years—is starting to show its age. That could mean trouble in the near term for bubbly copper prices, up around 10% this year, and lead to easier monetary policy later in 2019.

Property data released tomorrow will likely show house prices still rising, but the outlook is darkening, with land prices already showing pronounced weakness. Nearly every piece of housing-related data for January and February signaled trouble, the major exception being nominal property investment growth, which accelerated. Output growth weakened sharply for glass, power, nonferrous metals and cement. Housing and land space sold was down from a year earlier, the latter by 34%.

Most important, official data showed vacant, unsold residential floor space rising for the first time in two years—a sign the fall in inventories that has served as a key support for new construction since 2016 is probably at an end. The gap between investment growth and sales growth—11.6% and negative 3.2%, respectively—was wider than in all but one month since early 2015. The combination of strong investment and weak sales suggests restocking, and is unlikely to be sustainable.
It’s always dawn before the darkest. Photo: Yawen Chen/Reuters


Inventories remain low compared with the enormous glut of 2014, which presaged the next year’s housing slump. That suggests the looming downturn will be shallower too, as long as policy makers don’t try to delay the pain by opening the stimulus floodgates for another building spree.

Policy makers are in fact signaling an easier stance on the margins, though there is little sign yet of “irrigation-style” stimulus—the type forsworn by senior officials, including Premier Li Keqiang. China’s housing minister Tuesday said the housing market had recently become more “rational”—before reasserting Beijing’s firm line against speculation. Local governments in Zhejiang, Guangdong and Shandong have eased some housing restrictions in recent months.

None of that amounts to a big housing stimulus, but the market is coming off the boil—giving China’s central bank more room to ease, should it see the need. If exports don’t recover in the second half and overall investment remains weak, the People’s Bank of China could yet decide to make it rain.


Response to Comments: On Leadership and Necessity

By George Friedman

 
I’ve now shared three fragments of my next book on geopolitics with GPF readers: one on philosophy and geopolitics, one on historical determinism and one outlining a new geopolitical model for the current global system. The first intrigued readers and the third left you cold. But the second raised hackles, so it’s the second I will focus on here. It’s clear that I will need to be careful with the book’s section on determinism – I thank you for the warning.
The core issue of geopolitical determinism is the degree to which individual leaders control events. My argument is that leaders have some influence on secondary issues, but they are trapped on the primary ones. It is easier to make this case by example (although no single example is sufficient, and a coherent theory has to evolve). Let me use the case of the U.S.-Japanese conflict in 1941-45 to try to show you how it was inevitable regardless of who led these countries at the time.
The United States, situated as it is on the North American island, is secure so long as no power can attack it from the sea or cut off its trade routes. Japan, on a string of much smaller islands, has almost no natural resources and exists as an industrial power only by importing virtually all of the industrial minerals it needs. In 1941, Japan was importing most of these minerals from Indochina and what is today Indonesia. The sea lanes it used ran past the Philippines, which was then controlled by the U.S. If Japan could gain control of the Western Pacific, it would be massively strengthened. U.S. control of the Pacific would depend on the same islands, such as Tarawa or Saipan, that the U.S. would later break through en route to Japan. Japan, then, could potentially control the Pacific and put the U.S. in danger. Both sides understood the danger they were in.

When the Japanese invaded China in the 1930s, the U.S. sent aid to China. However, when the Japanese invaded Indochina in 1940, the U.S. became truly concerned. Japan had treaties with France and the Netherlands that facilitated the delivery of raw materials from Indochina and the Dutch East Indies (present-day Indonesia). When, during the course of World War II, Germany overran both France and the Netherlands, control over these countries – and thus the status of their treaties with Japan – became uncertain. Japan felt it had no choice but to invade Indochina and make plans for taking Indonesia; it could not be an industrial power without their raw materials.
The U.S. wanted to avoid war but could not let Japan take control of the Western Pacific. The American solution was to cut off sales of oil and scrap iron to Japan and to send agents to buy up all the Indonesian oil possible before the Japanese could get to it. The U.S. then moved toward a diplomatic settlement with Japan, in which the U.S. retained power to strangle the Japanese economy but agreed not to do so, as long as Japan did not make any aggressive moves. If the Japanese accepted this proposition, their country would exist only by American goodwill. This was impossible for them to do. Neither side was surprised when the Japanese used diplomacy to buy time, and the U.S. moved toward war footing.
The Japanese could neither withdraw from Indochina nor allow the U.S. to control Indonesian oil. But the Japanese could not secure these areas unless they controlled the Philippines, since U.S. air power and a fleet in the Philippines would be able to cut critical Japanese supply lines. Japan also knew that if it seized the Philippines, the U.S. would respond by sending its fleet to the archipelago while the Japanese still hadn’t consolidated their control there. (In fact, the United States’ War Plan Orange anticipated this.) Japan, therefore, had to attempt to destroy the American fleet at the beginning of the war. Hence Pearl Harbor.
The U.S. anticipated neither the fall of France and the Netherlands nor the desperation this development would cause in Japan. When the U.S. finally recognized the threat, it was not ready for war, and the Navy was tied up in the Atlantic. The U.S. knew it was vulnerable, and the Japanese knew they had a small window of opportunity. But the decision to strike Pearl Harbor had less effect on U.S. capabilities in the Pacific than most think.
The Japanese were going to win the first round of battle, according to U.S. war plans. It was in the second round, when U.S. manufacturers started cranking out ships, that the U.S. could respond. In the meantime, the U.S. made only token attempts to defend the Philippines, and no real effort to hold Indonesia. It couldn’t. Rather, it sought to keep open the lines of communication with Australia and to use Australia and Hawaii as bases from which to counterattack.
Note that I have thus far left out the names of the U.S. president and the Japanese leaders. Their hands were tied in two ways: First, in the reality of the Pacific, and second, in the institutional realities at home. Being unable to attack Japan, U.S. President Franklin D. Roosevelt had no choice but to hold steady first and then attack. Not attacking at all was not an option; the political system was shaped by the unfolding events. Similarly, neither Emperor Hirohito nor Prime Minister Hideki Tojo ruled Japan; rather, it was ruled by a complex of interests, all of which were highly sensitive to Japan’s economic situation and which demanded pre-emptive action. Each country’s strategic logic was closely paired with a parallel institutional logic. Neither Roosevelt nor Tojo had the power to act in any way other than the way he did. They had some discretion in the details, but in their broad strategic considerations, they could not have resisted their institutions, though as excellent politicians, they had no desire to.
The argument I am making is not that Roosevelt and Tojo are irrelevant. On the contrary, they were indispensable agents of their nations. A successful leader understands the constraints under which all human entities operate – whether individuals or nations – and having understood them the leader acts, because if he failed to act, leaving a nation in potentially dire straits, he would cease to be the leader. I am also arguing that leaders go through an extended vetting and training process that forces them to understand the disciplines of political rule and the realities of their nations. The more they understand these things, the more powerful they become. So political power does not free leaders to be arbitrary but drives them to understand what they must do. Obviously, there are endless minor issues on which they are free to indulge themselves. But when faced with existential realities, leaders respond as they must.
So, the argument I am making is that we must have a more sophisticated understanding of what a leader is. A leader becomes a leader because of a ruthless understanding of the nature of the nation and remains a leader by pursuing the nation’s interests. If the leader engages in excessive self-indulgence, the system – democratic, totalitarian or otherwise – crushes the leader through the forces that he or she set in motion. Nations generate the regime, and the leader emerges from and serves that process. Those who view leaders from afar may fancy them to be free to do as they wish, but that is the illusion of distance, not the reality. So, if Roosevelt or Tojo had passed away in 1940, the broad strokes of history would have remained the same.
Adolf Hitler was catapulted to power by the configuration of the German nation after World War I. He did not create his power but aligned himself with the power of the nation. This is easier to see in his foreign policy than in his domestic policy. But even in Hitler’s case, he was able to become Germany’s leader in 1933 but could not have done so in 1900. Reality created him, and he served it.
The idea I am putting forth is still merely a sketch, and it is not unique to me. Machiavelli made the case that a prince can govern only if he understands what he must do and is good at doing it. Others, like Georg Hegel and, to some extent, Thucydides, made similar arguments. I have here applied the theory to a familiar case from World War II. But when we consider Stalin or Charlemagne or Hannibal, we find that the source of their power, too, was their understanding of what had to be done. Apart from that, they would not have had power.
The entire concept of forecasting derives from predictability. I have had some success in forecasting events because I paid little attention to the personality and quirks of whoever led the nation at the time but focused instead on the more predictable forces that generated leaders and their imperatives.
This is the heart of what I mean by geopolitics, but I’m still trying to frame this satisfactorily, so, by all means, argue with me. Next, I intend to make the case that there is no difference between economics, politics and war, but that they are part of a single dynamic, incomprehensible except in terms of each other.


Personal Health

Colon Cancer Screening Can Save Your Life

With colorectal cancer being found in an increasing number of younger adults, the pressure is on to screen millions more.


Credit: Gracia Lam


Although I usually refrain from writing columns linked to national health observances, I believe that Colorectal Cancer Awareness Month, in March, is too important to ignore. Too many people are still getting this preventable disease, and dying from it, because they failed to get screened, including people who cannot claim ignorance, lack of health insurance or poor access to medical services as an excuse.

And as Joy Ginsburg’s experience shows, even some doctors may need to be pushed into encouraging their patients to be tested. Ms. Ginsburg of Leawood, Kan., where she is executive director of an organization that raises private funds for public education, was 48 when her primary care doctor suggested that she have a baseline colonoscopy.

But the gastroenterologist she consulted was reluctant to perform one. “He made fun of me,” she said. “I was not yet 50 and had no symptoms, risk factors or family history of colon cancer.”

Still, Ms. Ginsburg was aware that last year the American Cancer Society had lowered the recommended age to start screening from 50 to 45, so she insisted. And it was lucky that she did: a very large precancerous polyp, the size of a golf ball, was found, and removing it surgically required taking 40 percent of her colon.

“If I had waited until 50 to get screened, I would have had a very different story to tell,” she said. “Now I’m screaming from the rooftops for everyone to get screened. Having a colonoscopy is a lot easier than getting cancer.”

Five years ago this month, the National Colorectal Cancer Roundtable, established by the American Cancer Society and the Centers for Disease Control and Prevention, set a goal of getting 80 percent of all Americans ages 50 to 75 screened for cancers of the colon and rectum, by any medically accepted method, by 2018. At the time, only about 65 percent of those in the designated age group were up-to-date with an approved test.

Now, with colorectal cancer being found in an increasing number of younger adults, the pressure is on to screen millions more adults in every community of the United States. The current goal is to test at least 80 percent of residents ages 45 to 75 in each community using an approved method. More than 1,800 community organizations have already lined up to help make this happen.

“We’re not insisting that everyone get a colonoscopy, even though it’s the gold standard for detecting and preventing colon cancer,” said Dr. Richard C. Wender, chief cancer control officer at the cancer society. “A lot of people don’t want it, some can’t afford it, and sometimes it’s not available.”

But finances are not the only stumbling block, Dr. Wender said. “Seventy-five percent of the people who hadn’t been screened when the campaign started five years ago had health insurance, many of them through their employers.” 
Other issues include a failure of people “to get started with screening when they reach the appropriate age,” the doctor said. “In 2016, only about 49 percent of adults aged 50 to 54 had been screened.” Another obstacle is that all too often, people go to the doctor only when they’re sick. They’re not focused on preventive care, Dr. Wender said, and neither are many of their doctors.


How the American medical system is organized is yet another obstacle. The typical visit to independent primary care doctors is spent on diagnosis and treatment with little time left for prevention. By contrast, in the Kaiser Permanente system, which has a strong financial incentive to keep its members healthy, 80 percent of members throughout the system — and 90 percent of members on Medicare — have undergone screening for colorectal cancer.


Colorectal cancer is the second most common cause of cancer deaths in this country, with more than 51,000 people expected to die of the disease this year. Although the overall death rate has been dropping for several decades, thanks largely to increased detection and removal of precancerous polyps, deaths among people younger than 55 have increased by 1 percent a year since 2007. This means it’s all the more important to encourage screening among middle-aged adults.

These are the tests currently available:

Colonoscopy. Although this test is the most costly and involved, it is the best not only because it can find an existing cancer, but because it also detects polyps that may become cancer and can be removed during the screening test. It uses a scope and requires a thorough pretest cleansing of the bowel, a step most people find rather unpleasant. But for people without risk factors or a personal or family history of polyps or colorectal cancer, it is usually done only once every 10 years.

FIT. This immunological test, done yearly on a stool sample, checks for blood in the stool, a possible sign of cancer. It usually fails to pick up polyps and thus would not prevent cancer, but it is more effective than the old fecal occult blood test.

FIT-fecal DNA. This test, sold as Cologuard and usually done every three years, combines FIT with markers for abnormal DNA in the stool, making it better able to detect a cancer and advanced precancerous polyps. It misses about half of precancerous polyps.

Virtual colonoscopy. This is an imaging study done every five years via a CT scan that, like an ordinary colonoscopy, requires a thorough bowel cleansing. It visualizes the entire colon, but if anything abnormal is detected, a second prep is needed to permit a regular colonoscopy.
 
Septin 9 assay. This blood test, although more convenient and less “yucky” than stool tests, is only half as effective as the FIT test in detecting cancer and not at all effective in picking up precancerous lesions.

As important as getting screened for colorectal cancer is the need to avoid risk factors for the disease that individuals can control. These include being overweight or physically inactive, smoking, consuming alcohol immoderately, and eating lots of red meat and processed meat.

Risk factors that are unavoidable include getting older; having an inflammatory bowel disease (ulcerative colitis or Crohn’s disease), or having close relatives (such as parents or siblings) who had colorectal cancer or adenomatous polyps or an inherited cancer syndrome like Lynch syndrome or familial polyposis. Doctors may advise people with these risk factors to get screened for colorectal cancer as often as every year or two.


Jane Brody is the Personal Health columnist, a position she has held since 1976. She has written more than a dozen books including the best sellers “Jane Brody’s Nutrition Book” and “Jane Brody’s Good Food Book.”