

/pol/ - International /Pol/itics & /Bant/er

18+ | Politics & countrywars

File: 129892 - SoyBooru.png (277.16 KB, 2182x2748)

 β„–2714056[Quote]

Erm /pol/ I've been working on something and I want your input before I refine it further (because you're MY personal army or something)
<reddit space
So we all know news is fake and gay, but the question is HOW fake? Can we actually measure the degradation? I think yes. My theory is that every time information passes through a medium (from the raw truth of the event, to the first reporting on it, to secondary reporting based on that first reporting, to word of mouth, etc.) it loses fidelity. This is already established in information theory (the data processing inequality): you cannot create information by processing it, because by the nature of what information is, you can only preserve or destroy it. Anyway, I'm proposing we measure this loss rate in a unit I call Rands, which measure transmission fidelity on a scale. The scale peaks at infinity Rands, which is perfect unmediated perception; that's impossible, as you'd have to literally BE the thing. Anything above 1 Rand is impossible. 1.0 Rand is lossless transmission, like a perfect mirror or a direct sensor reading. 0.5 Rands means the medium sheds half the information per step. And 0 Rands means what's being communicated is total noise: none of the original signal remains whatsoever (heckin' dramaslop and mainstream media approach this).
<4caca space
To calculate Rands you basically just identify the discrete bits of information in the original source, then count how many survive mediation. Preserved information divided by original information = the Rand value.
So the more of the original information is lost, the lower the Rands; the more of it is preserved, the higher the Rands. At probably around 0.05 Rands you hit what I call the Semantic Event Horizon, which is generally the point where the news is about the news itself instead of the original event/raw truth/unmediated reality. It essentially becomes mediation about mediation, representations of representations of representations.
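Here's a minimal sketch of the calculation in Python (the fact lists and the threshold check are just hypothetical placeholders):

# Minimal sketch: a Rand is the fraction of discrete facts that survives mediation.
SEMANTIC_EVENT_HORIZON = 0.05  # below this, coverage is mostly about the coverage itself

def rand_value(original_facts, mediated_facts):
    """Preserved information / original information."""
    original = set(original_facts)
    if not original:
        raise ValueError("need at least one fact in the original source")
    preserved = original & set(mediated_facts)
    return len(preserved) / len(original)

# Hypothetical example: 4 facts at the raw event, 3 survive into the report.
event = {"who was involved", "what happened", "where", "when"}
report = {"what happened", "where", "when"}
print(rand_value(event, report))                           # 0.75
print(rand_value(event, report) <= SEMANTIC_EVENT_HORIZON) # False, still above the horizon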
<philosophy space
Anyway, the reason I think measuring this loss is useful: if you measure the loss between two mediation layers, you can extrapolate BACKWARDS to estimate how much of the original unmediated event has been lost, even if you never saw it. Like measuring radioactive decay to date something. You can then engage in a kind of forensic reconstruction of actual events from propaganda. It's a way of quantifying the telephone game effect, so to speak.
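As a toy model of that extrapolation, if you assume the decay rate per step is roughly constant (a big assumption, and the numbers are made up):

# Toy decay model: treat the per-step Rand like a half-life and extrapolate back.
observed_step_rand = 0.9  # Rand measured between two layers you CAN compare
steps_from_event = 4      # layers between you and the raw event
surviving_fraction = observed_step_rand ** steps_from_event
print(surviving_fraction)  # ~0.656 of the original information left, IF the rate is uniform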
<mediated space
Anyways thoughts or something? I'm kind of excited about this and think it's philosophically relevant and useful or something.

 β„–2714072[Quote]

Interesting

 β„–2714084[Quote]

>>2714056 (OP)
Long-winded way of saying primary sources are ideal.

 β„–2714103[Quote]

File for a patent on this before some cuck from Harvard steals it and takes full credit

 β„–2714105[Quote]

>>2714084
It's more so to say that we can determine, from degraded data alone, how severely the data we have has been degraded. You can figure out the delta between the original unmediated information and the information currently available after however many levels of mediation.

 β„–2714109[Quote]

>>2714103
I don't think you can patent methods or formulas albeit.

 β„–2714116[Quote]

"play the telephone game" essentially
And you're describing an immeasurable percentage, so it's best to put it on a scale of integers (e.g. five levels of truthfulness)
<
"Trump killed a million niggers" - above 0 Rand because it assumes that Trump is real and there were at least 1 million living niggers

 β„–2714118[Quote]

>>2714103
Ev&doe Patents are the brimstone that's driving up healthcare costs.

 β„–2714124[Quote]

File: ThinkerGigger.png (190.61 KB, 500x647)

Wojack party spammers are more sentient than the majority of Harvard students

 β„–2714129[Quote]

>>2714124
Snopes says this is true.

 β„–2714143[Quote]

>>2714116
I mean it depends what the original unmediated information is ultimately. The Rand is essentially supposed to be a measurement of efficiency of transmission at the most basic level. If the original information was "Trump killed a million niggers" then it would be a transmission of 1 Rand. But if the original message is supposed to be about like what actually is happening in AmeriKKKa or something then obviously it would have very few Rands, like 0.001 or something. Rands don't measure truthfulness of a message, just like the total loss of the original unmediated information in comparison to mediated information.

 β„–2714146[Quote]

File: SoyBooru.com - 126950 - ar….png (810.08 KB, 1174x1474)

File: SoyBooru.com - 126949 - ar….png (1.06 MB, 1174x1474)

File: SoyBooru.com - 126948 - ar….png (892.89 KB, 1174x1474)

File: SoyBooru.com - 126947 - ac….png (860.68 KB, 1174x1474)


 β„–2714169[Quote]

>>2714143
You still can't make a continuous or scientifically rigorous measurement of that as it pertains to semantics.

 β„–2714176[Quote]

Interesting perspective, but let's say there are 3 steps for "event A" to reach you. You cannot figure out how much of the information is missing just by looking at the loss between steps 2 and 3: that step may be 0.9 Rands, but event A to step 1 may have been 0.8 Rands, and step 1 to step 2 may have been 0.1 Rands. You can't really tell.
Also, shouldn't the scale peak at 1 Rand?
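To put numbers on the first point (all values made up), per-step Rands multiply:

# Per-step Rands multiply, so the one step you can observe says little about the total.
steps = [0.8, 0.1, 0.9]  # event->step1, step1->step2, step2->step3 (made-up values)
total = 1.0
for r in steps:
    total *= r
print(total)  # ~0.072: almost everything lost, even though the observed step was 0.9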

 β„–2714179[Quote]

>>2714169
It's possible, just very difficult. There is such a thing as bits of information in speech; it's just very hard to measure. This is the part of the theory I need help with, frankly.

 β„–2714183[Quote]

>>2714116
2/3rds of a rand
trump :tick:
million niggers :tick:
killed ✗ (false)

 β„–2714185[Quote]

I don't think measuring delta Rand (the total Rand from the event itself to the final source) is possible unless you know every single intermediate source's Rand, because all sources will have independent, non-identical Rand values.

 β„–2714202[Quote]

"Accurate" is vague. What information do you want to measure with this?

 β„–2714211[Quote]

>>2714176
It does peak at 1 Rand, due to basic ontological realities. You're also correct that there will be some inaccuracy in determining how much unmediated information was lost. If information is lost, we cannot recover it through any algorithm, and that includes information about how much information was lost. So it will not be extraordinarily precise, but I think it will give a very good estimate of how much information has been lost after a number of layers of mediation. You will not be able to determine with 100% certainty that, for example, an original raw image was 712 MB, but you can reach that estimate and rely on it being approximately right.
>>2714185
The Rand itself measures the delta. No single source has a value in Rands on its own; Rands are a measurement of loss between two sources.
>>2714202
Loss of information in transmission. For example, in a political context, the loss of information from the unmediated original event to what is reported in the news.

 β„–2714264[Quote]

I can kind of see this working, but it would be pretty subjective what you count as important info and what you don't. Example of how I think it could work:

^News report on a car chase:

>Car is a Ford (check)
>Car is blue (check)
>Suspect name is Darius (check)
<Suspect is black (left out)
^Result: 0.75 rands

Maybe I'm thinking about it all wrong. Correct me if that is the case, OP.

 β„–2714272[Quote]

File: ClipboardImage.png (66.7 KB, 742x743)

>>2714211
What I'm trying to say is: you cannot assume the Rand for every transmission step by knowing only one Rand value, since they can change.

 β„–2714275[Quote]

>>2714056 (OP)
Useless.

 β„–2714290[Quote]

What I am trying to say is: this is an interesting theory, but you can't calculate the total Rand from the event itself to the final source by knowing only one Rand inside one of the steps, unless you know every single step has the same Rand (which will never happen).

 β„–2714292[Quote]

>>2714179
How is this for a measurement of bits in semantic communication? I call it the prototaxis. I begin with a few principles. First, for a unit of information to exist, it must refer to a discrete entity with specific properties. If a statement doesn't identify a thing, it contains 0 bits, naturally; concepts are measurements omitted. So a bit of truth is the successful identification of a referent in reality. Second, truth claims are only valid if they don't contradict the physical reality of their own conveyance. This is the famed Axiom of Action. Third, information must be tied to verifiable propositions regarding physical entities and their movements in spacetime. So, then, to discretize speech, we must break it down into the smallest possible packet that satisfies these three constraints, I think. We will call this unit a prototaxis (protos for first and taxis for arrangement). For a sentence to contain 1 prototaxis, it must establish a Subject-Predicate-Referent chain that can be verified as a physical interaction. For example, "The man hit the ball." Or "The politician spoke." These are both 1 prototaxis because they identify discrete entities and a physical change of state. "The man is bad" is 0 prototaxes because "bad" has no fixed physical identity and describes no praxeological action. So essentially the prototaxis is the smallest discrete propositional statement that identifies a specific entity and ascribes to it a non-contradictory action or attribute verifiable within a physical causal chain.
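A rough sketch of how you might discretize that in code (purely hypothetical structure, in Python):

# Hypothetical sketch: a prototaxis as a subject-predicate-referent triple
# naming discrete entities and a physically verifiable action.
from dataclasses import dataclass

@dataclass(frozen=True)
class Prototaxis:
    subject: str    # discrete entity
    predicate: str  # physical, verifiable action or change of state
    referent: str   # what the action bears on (may be the subject itself)

def count_prototaxes(claims):
    """Anything that isn't a full subject-predicate-referent chain carries 0 bits."""
    return sum(1 for claim in claims if isinstance(claim, Prototaxis))

claims = [
    Prototaxis("the man", "hit", "the ball"),                 # 1 prototaxis
    Prototaxis("the politician", "spoke", "the politician"),  # 1 prototaxis
    "the man is bad",  # no fixed physical referent for "bad": 0 prototaxes
]
print(count_prototaxes(claims))  # 2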

 β„–2714306[Quote]

>>2714292
There are bits of information that depend on local coordinates (a frame of reference), therefore rendering your whole paragraph useless.

 β„–2714307[Quote]

>>2714056 (OP)
Sounds cool, but this doesn't really work when it comes to subversive language that, although true, still hints at an agenda. I'm too lazy to write an example, but if you don't know what I'm talking about then I can. To simplify, it's just pedantry.

 β„–2714312[Quote]

>>2714272
I do not assume that the Rands between stages of transmission are uniform. The idea is to use known datapoints to extrapolate to the unknown. Obviously, you cannot know the unknown: you cannot know for certain the Rands between the original unmediated information and the later mediated information without knowing the unmediated information itself. But you can extrapolate between two layers of mediated information to estimate roughly how much has been lost from the original. Does this make sense?

 β„–2714319[Quote]

>>2714312
So there is no absolute scale, only measures of the difference in the amount of original information?

 β„–2714335[Quote]

>>2714319
Correct. A Rand is basically the inverse of the rate of informational decay in transmission.

 β„–2714350[Quote]

>>2714335
Okay, I will ask a series of questions to show that the scale is useless.
How does one assign a unit of Rand if only the difference between amounts of Rands can be measured?

 β„–2714361[Quote]

This already kind of exists, it's called the Shannon/Sh, and it could be rephrased in a way that quantifies how information is manipulated, I think
https://en.wikipedia.org/wiki/Shannon_(unit)

 β„–2714395[Quote]

>>2714361
>it could be rephrased
No. The shannon is the information content of an event with a probability of 50% (one binary outcome) and concerns its entropic evolution. This has almost nothing to do with this.

 β„–2714402[Quote]

>>2714211
I know, but keeping all of the information is useless. You must exclude everything that you don't want to measure or else the results will be wrong

 β„–2714410[Quote]

>>2714402
There should be specific rand values instead of one vague value

 β„–2714425[Quote]

>>2714395
Not on a small scale, but on a larger scale you could measure the entropy of vocabulary and the frequency of reporting on a topic to gauge the potential for massive manipulation, or to see how much the narrative surrounding current events has been compressed or expanded beyond its actual scope.
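e.g. plain Shannon entropy over the vocabulary of coverage (a toy example, not a finished methodology):

# Toy sketch: lower vocabulary entropy across coverage = a more compressed narrative.
from collections import Counter
from math import log2

def vocab_entropy(tokens):
    """Shannon entropy (in bits) of the word distribution of a body of coverage."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Made-up snippets: repetitive talking points carry less information per word.
organic = "the crash had many causes and several agencies dispute the details".split()
talking_points = "the crash was bad the crash was bad the crash was bad".split()
print(vocab_entropy(organic))         # higher: varied vocabulary
print(vocab_entropy(talking_points))  # lower: compressed, repetitive narrative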

 β„–2714460[Quote]

>>2714425
Who defines what up and down is on the scale, i.e. which news has more Rands than others? Of course there is information loss, but you want to measure how manipulative media is. The problem is that media adds information that is not true, so without knowing what originally happened, or assigning arbitrary values, you can't make a scale.

 β„–2714822[Quote]

>forensic reconstruction of actual events from propaganda
Many people do this already when researching events & history, but I feel it is too abstract a process to build an accurate numbering system on, to derive a consistent value from the murkiness of semantics. Sounds very hard.
Other than that, great post.


