Africa-focused oil and natural gas producer Tullow Oil reported on Wednesday a surprise annual operating profit after three years in the red, and said it expected first oil from Kenya in 2021 or 2022.
The company reported an operating profit of $22 million for the year ended Dec. 31, compared with a loss of $755 million in 2016. Analysts were expecting a loss of $103.6 million, according to company-compiled consensus.
Tullow said working interest production was 32 percent higher at an average of 94,700 barrels of oil equivalent per day (boepd) in 2017. It forecast 2018 production in the range of 86,000 to 95,000 boepd.
The company said it planned phased development in Kenya, with final investment decision expected in 2019.
Reporting by Arathy S Nair in Bengaluru; Editing by Amrutha Gayathri (Reuters)
Seychelles inflation rose to 4.5 percent year-on-year in January from 3.48 percent a month earlier, the National Bureau of Statistics said on Wednesday.
Reporting by Clement Uwiringiyimana; Editing by Raissa Kasolowsky (Reuters)
Petrochemicals group Sasol completed a 13.6 billion rand ($1 billion) expansion of a wax plant in South Africa that will boost its annual production, the firm said on Tuesday.
Sasol, the world’s top maker of motor fuel from coal, has increasingly diversified into chemicals, gas and clean energy projects, in part to meet a global shift to low-carbon products.
The project to produce wax, used in adhesives and printing, is one of the company’s largest investments in South Africa. It was funded through Sasol’s own cash and is part of a strategy to expand its chemicals businesses.
Sasol said it aimed to ramp up production to 137,000 tonnes per annum of wax within the next two years at the Sasolburg site, 100 km (62 miles) south of Johannesburg.
Sasol produced 63,000 tonnes of wax in the six months to the end of December.
The wax, which will be exported, is used in hot-melt adhesives that seal cereal boxes and milk cartons, and in products such as printing inks, paints, candles, emulsions and 3D printing materials.
The company, mostly known for pioneering the conversion of coal to fuel, also produces gas-to-liquids and polymers used for packaging materials among other uses.
Sasol Co-Chief Executive Officer Bongani Nqwababa said production was running ahead of schedule.
The company began construction of the plant in 2015.
“With completion of this project, South Africa is now one of the leading countries of wax production globally,” he said.
($1 = 12.0759 rand)
Editing by James Macharia and Edmund Blair (Reuters)
I began my research career in the last century with an analysis of how news organisations were adapting to this strange new thing called “the Internet”. Five years later I signed up for Twitter and, a year after that, for Facebook.
Now, as it celebrates its 14th birthday, Facebook is becoming ubiquitous, and its usage and impact are central to my (and many others’) research.
In 2017 the social network had 2 billion members, by its own count. Facebook’s relationship with news content is an important part of this ubiquity. Since 2008 the company has courted news organisations with features like “Connect”, “Share” and “Instant Articles”. As of 2017, 48% of Americans rely primarily on Facebook for news and current affairs information.
Social networks present news content in a way that’s integrated into the flow of personal and other communication. Media scholar Alfred Hermida calls this “ambient news”. It’s a trend that has been considered promising for the development of civil society. Social media – like the Internet before it – has been hailed as the new “public sphere”: a place for civic discourse and political engagement among the citizenry.
But, unlike the Internet, Facebook is not a public space in which all content is equal. It is a private company. It controls what content you see, according to algorithms and commercial interests. The new public sphere is, in fact, privately owned, and this has far-reaching implications for civic society worldwide.
When a single company acts as the broker for news and current affairs content for a majority of the population, the potential for abuse is considerable. Facebook is not seen as a “news organisation”, so it falls outside whatever regulations countries apply to “the news”. And its content is provided by myriad third parties, often with little oversight or tracking by countries’ authorities. So civic society’s ability to address concerns about Facebook’s content becomes even more constrained.
Getting to know all about you
Facebook’s primary goal is to sell advertising. It does so by knowing as much as possible about its users, then selling that information to advertisers. The provision of content to entice consumers to look at advertising is not new: it’s the entire basis of the commercial media.
But where newspapers can only target broad demographic groups based on language, location and, to an extent, education level and income, Facebook can narrow its target market down to individual level. How? Based on demographics – and everything your “likes”, posts and comments have told it.
This ability to fine-tune content to subsets of the audience is not limited to advertising. Everything on your Facebook feed is curated and presented to you by an algorithm seeking to maximise your engagement by only showing you things that it thinks you will like and respond to. The more you engage and respond, the better the algorithm gets at predicting what you will like.
When it comes to news content and discussion of the news, this means you will increasingly only see material that’s in line with your stated interests. More and more, too, news items, advertisements and posts by friends are blurred in the interface. This all merges into a single stream of information.
And because of the way your network is structured, the nature of that information becomes ever more narrow. It is inherent in the ideals of democracy that people be exposed to a plurality of ideas; that the public sphere should be open to all. The loss of this plurality creates a society made up of extremes, with little hope for consensus or bridging of ideas.
An echo chamber
Most people’s “friends” on Facebook tend to be people with whom they have some real-life connection – actual friends, classmates, neighbours and family members. Functionally, this means that most of your network will consist largely of people who share your broad demographic profile: education level, income, location, ethnic and cultural background and age.
The algorithm knows who in this network you are most likely to engage with, which further narrows the field to people whose worldview aligns with your own. You may be Facebook friends with your Uncle Fred, whose political outbursts threaten the tranquillity of every family get-together. But if you ignore his conspiracy-themed posts and don’t engage, they will start to disappear from your feed.
Over time this means that your feed gets narrower and narrower. It shows less and less content that you might disagree with or find distasteful.
These two responses, engaging and ignoring, are both driven by the invisible hand of the algorithm. And they have created an echo chamber. This isn’t dissimilar to what news organisations have been trying to do for some time: gatekeeping is the expression of the journalists’ idea of what the audience wants to read.
Traditional journalists had to rely on their instinct for what people would be interested in. Technology now makes it possible to know exactly what people read, responded to, or shared.
For Facebook, this process is now run by a computer: an algorithm that reacts instantly to provide the content it thinks you want. But this fine-tuned and carefully managed algorithm is open to manipulation, especially by political and social interests.
Extreme views confirmed
In the last few years Facebook users have unwittingly become part of a massive social experiment – one which may have contributed to the surprise election of Donald Trump as president of the US and the UK’s vote to leave the European Union. We can’t be sure of this, since Facebook’s content algorithm is secret and most of the content is shown only to specific users.
What researchers have found is that the more extreme the views a user has already agreed with, the more extreme the content they are shown. People who liked or expressed support for leaving the EU were shown content that reflected this desire, but in a more extreme way.
If they liked that, they’d be shown even more such content, and so on, with the group getting smaller and smaller, and more and more insular. This is similar to how extremist groups identify and court potential members, enticing them with ever more radical ideas and watching their reactions. That sort of personal interaction was a slow process; Facebook’s algorithm works at lightning speed, and the pace of radicalisation increases exponentially.