I spent the afternoon with Smallsteps, who now comes directly home from school rather than going to afternoon care, so she has time to burn. I don't want her to get accustomed to me entertaining her each day, so I gave her the option of playing after doing her homework, or helping me clean the fridge after I had finished writing. She played while I wrote, and then she helped wash the shelves of the fridge in the sink, whilst I sorted through and reorganised the contents.
We had a good discussion during this time about learning and how everything is essentially patterns, where some kind of series leads to an outcome. She doesn't yet understand the concept of mathematical formulas, so the pattern analogy worked. So, for example, when she reads, she first learns the possible letters in the alphabet and then how they work together. From there, each word is just a pattern made from those pieces. Similarly, we talked about music, how each song is made from the same notes, but depending on the timing and order, the pattern changes, and a different song is made.
We looked at some other examples, like building a house, where the foundation is like the alphabet, the base that everything else is built upon. Or the patterns of using cutlery correctly to eat, where once the patterns are practised, they become habitual and thoughtless. All of what we do is *formulaic.* Even if we don't know or understand the formula being used.
> A little later, this conversation came to mind.


> This isn't for the benefit of the users.
However, I think that a place like Facebook (I am not a user) shouldn't be the guardian of truth for two billion people. And we have seen, many times in hindsight, that the social platforms got it very wrong, because "fact" was whatever the government decided was fact, and fallacy was anything the government disagreed with.
But, the stated reason for getting rid of the fact checkers is that they are biased and making an increasing number of errors. There are many reasons for this, including the explosion of AI content and AI automation of the checking itself, as the pool of information on what is fact and fiction becomes increasingly muddy. Not only this, regardless of whether something is fact or not, we live in an age where the truth is whatever you believe it to be, to the point where a man can beat a woman up and win a medal for it, *as a woman.*
The thing is that, just like gender, fact and fiction are not binary these days; they are on a spectrum, and truth is dictated by perspective *and opinion.* Not only this, information is no longer taken as living in an ecosystem with interdependencies and nuances. Rather, each individual sees a black and white right and wrong and *expects* that their personal viewpoint on every topic is respected wherever they go and in whatever they consume, even if they are not the target market of the content.
> As you can imagine, this is a nightmare for a platform to moderate for acceptable facts.
But, by giving up fact-checking and pushing it to the community, they can "wash their hands" of the liability, and instead push blame and due diligence onto the users. That sounds like a good thing, and maybe it is, but as we know, users are overconfident and lazy idiots, driven by desire and greed - *because we are human.* We aren't going to do our own fact checking, and they know that. This means the volume of created content can go up unverified, the cost of policing can go down, and the ad revenue model can make more, maximising profits a bit further.
And of course, another reason to no longer censor is to provide a platform for any kind of political discourse, which again sounds good, but actually just empowers the outspoken minority to become increasingly vocal and vitriolic. Leaving it up to the community to moderate makes it a place for fighting and point scoring, not discussion and debate.
> There are patterns in play.
The part of the formula we need to remember when it comes to corporate platforms is that every decision they make is in service of increasing or defending profits. If the best way to do this is also in the best interest of the user, they don't mind, but it *so rarely is.* All this move is, is more political manoeuvring to take advantage of the changing information landscape, where "fake news" is *all the news,* and fact and truth are no longer verifiable. Facebook don't want to police it, because it is a winless situation in modern society. We say we want the truth, but even if we do, that is no guarantee we will accept it as such anyway.
I don't care what the centralised platforms do with their content, but what this move will do is further distort information and erode trust, which means that at some point, people might actually be willing to build and use systems they *can trust.* No centralised information can be trusted, because there is just too much incentive to skew the message, especially when the algorithms doing the checks and distributing those messages are designed to maximise profit.
Eventually, my daughter is likely to end up at least trying many of these platforms, but I am not going to send her in blind. Instead, she is going to have the foundation to read and understand the patterns she is seeing, and at least *have a chance* to consume with discernment, rather than naïveté. One thing we need to recognise is that there is far too much we don't know to be too confident in what we think we do know.
That is a fact.
Taraz
[ Gen1: Hive ]
That's Fact Up
@tarazkp
· 2025-01-07 23:23
· Reflections
#philosophy
#psychology
#mindset
#family
#health
#reflect
#society
#information
#truth
Payout: 0.000 HBD
Votes: 322