Before reporting on tech companies for Inc., I worked for local newspapers. My office was a physical place where I showed up five days a week and sat among coworkers whose presence could be a little distracting at times, but with whom I developed a closeness and a certain synchronicity.

Election seasons were when that synchronicity really kicked in. Instead of reporters scattering to various meetings in different locations on different topics, we worked as though we were running a relay race, piecing together a complete picture for the paper the next day, often splitting up stories and sharing bylines.

The aspect of that synchronicity I took for granted wasn't the workflow or camaraderie, but the shared experience and assumptions. Even as we reported from different campaigns, with election results coming in at different times from different places and occasional confusion over outcomes, the newsroom had a uniform idea of how elections worked and a shared goal of creating something whole at the end of the night.

In my role at Inc., I work remotely. On days when our three editorial staff in San Francisco don't meet up for coffee or lunch, our interactions are restricted to phone, email and Slack. We don't overhear each other's conversations like we would in a daily workplace environment, or bump into each other on the stairs of an office building (though we occasionally bump into each other around town).

Spontaneous social interactions come not from neighboring cubicles or desks, but from Twitter and Facebook. When social media is the background chatter to your workday, as it is for many who work remotely or from home--entrepreneurs, freelance marketers and copywriters, web developers--you can become sensitive to disturbances in your virtual environment. That's why, when Facebook's Trending feature presented a conspiracy theory to me on Monday without context or explanation, I felt especially uncomfortable.

A conspiracy theory.

The vehicle for that conspiracy theory was a classic nothingburger. Wikileaks on Monday announced it was changing the logistics of an announcement to be made by founder Julian Assange. Instead of the planned statement taking place from the balcony of the Ecuadorian Embassy in London, Wikileaks said Assange would appear via internet livestream at a press conference in Berlin, Germany. The organization cited "specific information" as the reason for the change.

This was factual news, however minor it seemed. Wikileaks actually changed the logistics of an event, citing "specific information," which some speculated to be a reference to Assange's security concerns. The news appeared in Facebook's Trending feature under the topic title, "Julian Assange." Users who clicked through to the Assange trending topic page, however, found a report that Hillary Clinton had sought to assassinate the Wikileaks head. In the "Top Posts" section an InfoWars headline read, "Wikileaks: Hillary Clinton Proposed Killing Assange with Drone Strike."

There was no logical reason to give credence to that headline. Snopes classified the claim as "unproven." Fox News gave the allegation one sentence in a story about the logistical change, noting there had been "no recent public revelations directly tied to Assange's [general] security fears." When Assange finally made the announcement that was the subject of the trending topic--that Wikileaks planned a dump of records including content pertaining to the U.S. elections--the Reuters report of the press conference mentioned nothing about a rumored proposed drone attack. I feel silly even bothering to explain how the InfoWars headline was bogus. But there I was, reading a conspiracy theory in a Facebook feature that purports to showcase factual information.

Facebook Trending has suffered from well-documented convulsions. Following allegations in May that the trending topics team had been deliberately suppressing right-wing news sources, the company made a point of trying to include news from sources with a wider range of political leanings. Facebook laid off the editorial staff of its trending topics team in August and reportedly increased its reliance on algorithms to determine what constituted trending news. Within a week of the layoffs, the little blue rectangle in the right corner of users' screens promoted reports that Megyn Kelly had been fired from Fox News for supporting Clinton's presidential candidacy. Last month, the feature told users the Twin Towers had been brought down by bombs, not airplanes, on Sept. 11, 2001.

What bothered me about the InfoWars story was how subtly it slipped into the mix. I didn't see any headlines blasting Facebook for promoting yet another hoax. It wasn't the main trending topic link, but a link in the Top Posts section that users would see only if they clicked on the hyperlinked trending topic in the main Trending feature box. And in fact, not all users see the same links. When friends sent me screenshots of how the Top Posts section of the Assange trending page looked to them, there was no drone attack story.

Feeling gaslighted.

The discombobulation and frustration I felt at being narrowcast bogus news reminded me of my first encounter with virtual reality goggles. A friend whose roommate worked at a tech company developing virtual reality tools let me try on hardware for a VR developer kit. In the demo virtual environment, I sat at a virtual desk with a banana on the corner. At one point, I turned my head to look at another part of the desk. When I looked back at the original corner, the banana had been replaced with a book. I exclaimed, telling my friend and his roommate what I had observed. They laughed, but didn't seem to know what I was talking about. I remember commenting that such unpredictable small changes in a virtual environment could be used as a form of torture. The goggles were gaslighting me.

Facebook and Twitter together make up my main "windows to the world" during the work week. On Monday, it felt like one of those windows was gaslighting me. Facebook was pushing a conspiracy theory into my feed and treating the false report as though it had equal merit to something like the PBS livestream of the vice presidential debate. The livestream, as I write, occupies one of the two "Top Posts" spots on the page for the Facebook Trending topic "Vice Presidential Debate."

When I emailed Facebook about what I had seen Monday, the company explained the Top Posts section is essentially what you would see if you manually searched a topic on Facebook, and results are automated. A spokesperson said the company was working to improve the section. The spokesperson didn't address the drone attack posts, which I had mentioned in my emails.

There's an argument to be made that presentation of links in a search result feature doesn't legitimize them or indicate their level of accuracy. But after months of controversy over how the Trending feature works, Facebook's response sounds like more of an evasion than an explanation. Framing the Top Posts section as semi-random is a way to avoid association with any questionable content in the section. Facebook doesn't want to be viewed as responsible for what its own technology serves up.

Not everyone will react as strongly to this sort of thing as I did. Like I said, I rely a lot on Facebook, probably too much, for glimpses of the world during the workday. Then again, Facebook's newest initiative is an enterprise tool called Facebook for Work. If that works out as Mark Zuckerberg hopes, millions more people will be spending additional hours a day in a Facebook-owned environment -- and in the Facebook-owned environment that already has more than 1.7 billion users, fact and fiction appear interchangeable.

From the beginning, Zuckerberg has couched Facebook not as a fun way to kill time but as an important avenue for connection and understanding. But before they can understand each other, people need to be able to agree on basic facts. Otherwise, all you're doing is isolating them in their own increasingly disconnected realities.