Griefer Jesus is a reference to the GTA 5 chaos mod, where a dude dressed as Jesus kills you with either a railgun or a flying bike
Our god is an awesome god.
I know the mod developer, and he indeed took inspiration from the GTA 5 chaos mod for the HOI4 chaos mod, as he's a fan of DarkViperAU, who has made (and still makes) a ton of episodes with the chaos mod.
@@legaullonapoleonien8760 Yep, I watch Matt as well
@@hakonhugiandersen1740 I watch Nick too!
@@legaullonapoleonien8760 he shaved his face 😭😭😭
At this point I really believe that he loves suffering
What gave that away
@@_HollowBeing_ I don't know, maybe the whole formable nations series 💀
he has clinical depression
New day new suffering
Every day is suffering
that do be what buddha said
As someone who has lived in Biloxi, I was surprised too.
lies. no one lives there
biloxi bros
@@Podzhagitel yes, no one _lives_ there, but people _lived_ there, until bigkahuna_08 left, after which it collapsed.
Lies you tell lies i say
This was painful to watch, I can't even imagine how much suffering this man has gone through.
16:00 actually Menderes mostly wears sunglasses irl. What a missed opportunity that they didn't choose one of his meme poses with his sunglasses. Maybe the eyebrows were the reason, huh
His sunglasses weren’t able to seal the eyebrows
@@_HollowBeing_ His eyebrows weren't the only reason we hanged him
this dude has the most passive aggressive meltdowns I’ve ever seen
I live in Biloxi trust me
Biloxi is a ploy by the US government to piss me off, it ain’t real
Wear tight pants for me girl
@@John-li6sk haha
@@_HollowBeing_ It's an actual real city in Mississippi
Thanks for playing the Chaos mod, I love the Chaos mod. Yes, the mod developer is French and I know him, @cobrathibok!
13:11 holy shit we are synchronised, I just sent the "Beating You With A Frying Pan ASMR" video to my friend a few hours ago
We all watched it, innit?
It’s truly an experience
@@NXT_GEN_LETSPLAY probably
It's definitely fun to listen to
''Fritz Fisher. Born into a tumultuous era, Fritz emerged as a charismatic leader in Germany during a period of political instability. Fritz Fisher's early life was shrouded in mystery, with conflicting accounts of his upbringing. Some claimed he was an orphan, while others insisted he hailed from a prestigious family with a long lineage. Regardless of his origins, Fritz possessed a natural gift for oratory and leadership, capturing the hearts and minds of those around him. During a time of economic turmoil and social unrest in Germany, Fritz Fisher rose to prominence with a message of hope and change. He skillfully navigated the political landscape, forming alliances with key figures in the military and industry. His populist rhetoric resonated with a disillusioned population, promising a brighter future and a return to greatness for the nation. However, as Fritz consolidated his power, whispers of a darker side began to circulate. Some claimed that behind the charismatic facade, he ruled with an iron fist, suppressing dissent and eliminating rivals. The once-promised utopia began to show cracks as authoritarian measures were implemented, and opposition voices were silenced. As tensions escalated both within Germany and on the international stage, Fritz Fisher's rule grew increasingly erratic and paranoid. Fearing internal conspiracies and external threats, he tightened his grip on power, leading the nation down a dangerous path. Eventually, his actions caught up with him, and a clandestine group of dissenters orchestrated his downfall. In a dramatic turn of events, Fritz Fisher was murdered under mysterious circumstances. The details surrounding his death remained a closely guarded secret, with conflicting accounts and conspiracy theories circulating for years. The nation he once led was left in a state of uncertainty, grappling with the legacy of a charismatic leader turned tyrant. 
The brief reign of Fritz Fisher became a cautionary tale, a dark chapter in the alternate history of Germany. His legacy served as a reminder of the fragility of power and the dangers of unchecked ambition. The world watched as Germany, scarred by the tumultuous episode, struggled to rebuild and redefine itself in the aftermath of Fritz Fisher's rise and fall.'' :)))))
From Wikipedia, the free online encyclopedia
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020.
Like its predecessor, GPT-2, it is a decoder-only transformer model of deep neural network, which supersedes recurrence- and convolution-based architectures with a technique known as "attention". This attention mechanism allows the model to selectively focus on the segments of input text it predicts to be most relevant. GPT-3 has 175 billion parameters, each stored with 16-bit precision, thus requiring 350 GB of storage space, as each parameter takes 2 bytes. It has a context window of 2,048 tokens, and has demonstrated strong "zero-shot" and "few-shot" learning abilities on many tasks.
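The 350 GB storage figure above follows directly from the parameter count and precision; a quick, purely illustrative sanity check:

```python
# 175 billion parameters at 16-bit (2-byte) precision.
params = 175_000_000_000
bytes_per_param = 2  # fp16: 16 bits = 2 bytes

total_bytes = params * bytes_per_param
total_gb = total_bytes / 1_000_000_000  # decimal gigabytes, as in the article

print(total_gb)  # 350.0
```

This matches the quoted figure exactly because the article is using decimal gigabytes (10^9 bytes).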
On September 22, 2020, Microsoft announced that it had licensed GPT-3 exclusively. Others can still receive output from its public API, but only Microsoft has access to the underlying model.
According to The Economist, improved algorithms, more powerful computers, and a recent increase in the amount of digitized material have fueled a revolution in machine learning. New techniques in the 2010s resulted in "rapid improvements in tasks", including manipulating language.
Software models are trained to learn by using thousands or millions of examples in a "structure ... loosely based on the neural architecture of the brain". One architecture used in natural language processing (NLP) is a neural network based on a deep learning model introduced in 2017: the transformer architecture. There are a number of NLP systems capable of processing, mining, organizing, connecting and contrasting textual input, as well as correctly answering questions.
On June 11, 2018, OpenAI researchers and engineers published a paper introducing the first generative pre-trained transformer (GPT), a type of generative large language model that is pre-trained on an enormous and diverse corpus of text, followed by discriminative fine-tuning to focus on a specific task. GPT models are transformer-based deep-learning neural network architectures. Previously, the best-performing neural NLP models commonly employed supervised learning from large amounts of manually labeled data, which made it prohibitively expensive and time-consuming to train extremely large language models. The first GPT model was known as "GPT-1," and it was followed by "GPT-2" in February 2019. Created as a direct scale-up of its predecessor, GPT-2 had both its parameter count and dataset size increased by a factor of 10: it had 1.5 billion parameters and was trained on a dataset of 8 million web pages.
In February 2020, Microsoft introduced its Turing Natural Language Generation (T-NLG), which it claimed was the "largest language model ever published at 17 billion parameters". It performed better than any other language model at a variety of tasks, including summarizing texts and answering questions.
On May 28, 2020, an arXiv preprint by a group of 31 engineers and researchers at OpenAI described the achievement and development of GPT-3, a third-generation "state-of-the-art language model". The team increased the capacity of GPT-3 by over two orders of magnitude from that of its predecessor, GPT-2,[13] making GPT-3 the largest non-sparse language model to date. Because GPT-3 is structurally similar to its predecessors, its greater accuracy is attributed to its increased capacity and greater number of parameters. GPT-3's capacity is ten times larger than that of Microsoft's Turing NLG, the next largest NLP model known at the time.
Lambdalabs estimated a hypothetical cost of around US$4.6 million and 355 years to train GPT-3 on a single GPU in 2020, with lower actual training time achieved by using more GPUs in parallel.
Sixty percent of the weighted pre-training dataset for GPT-3 comes from a filtered version of Common Crawl consisting of 410 billion byte-pair-encoded tokens. Other sources are 19 billion tokens from WebText2, representing 22% of the weighted total; 12 billion tokens from Books1, representing 8%; 55 billion tokens from Books2, representing 8%; and 3 billion tokens from Wikipedia, representing 3%. GPT-3 was trained on hundreds of billions of words and is also capable of coding in CSS, JSX, and Python, among other languages.
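For illustration only, here is a small script that tabulates the mix quoted above. It makes the key point visible: the sampling weights deliberately differ from the corpora's raw token shares, so smaller, higher-quality sources like WebText2 are oversampled relative to their size.

```python
# Token counts (billions) and sampling weights (fraction of weighted total),
# as quoted in the paragraph above.
corpora = {
    "Common Crawl (filtered)": (410, 0.60),
    "WebText2":                (19,  0.22),
    "Books1":                  (12,  0.08),
    "Books2":                  (55,  0.08),
    "Wikipedia":               (3,   0.03),
}

total_tokens = sum(tokens for tokens, _ in corpora.values())  # 499 billion

for name, (tokens, weight) in corpora.items():
    raw_share = tokens / total_tokens
    print(f"{name}: raw share {raw_share:.1%}, sampling weight {weight:.0%}")
```

For example, WebText2 is only about 3.8% of the raw tokens but receives a 22% sampling weight, while Common Crawl's raw share (about 82%) is sampled down to 60%.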
Since GPT-3's training data was all-encompassing, it does not require further training for distinct language tasks. The training data contains occasional toxic language, and GPT-3 occasionally generates toxic language as a result of mimicking its training data. A study from the University of Washington found that GPT-3 produced toxic language at a level comparable to similar natural language processing models such as GPT-2 and CTRL. OpenAI has implemented several strategies to limit the amount of toxic language generated by GPT-3. As a result, GPT-3 produced less toxic language than its predecessor model, GPT-1, although it produced both more toxic generations and more toxic language than CTRL Wiki, a language model trained entirely on Wikipedia data.
On June 11, 2020, OpenAI announced that users could request access to its user-friendly GPT-3 API, a "machine learning toolset", to help OpenAI "explore the strengths and limits" of this new technology. The invitation described how this API had a general-purpose "text in, text out" interface that can complete almost "any English language task", instead of the usual single use case. According to one user, who had access to a private early release of the OpenAI GPT-3 API, GPT-3 was "eerily good" at writing "amazingly coherent text" with only a few simple prompts. In an initial experiment, 80 US subjects were asked to judge whether short (~200-word) articles were written by humans or by GPT-3. The participants judged correctly 52% of the time, only slightly better than random guessing.
On November 18, 2021, OpenAI announced that enough safeguards had been implemented that access to its API would be unrestricted. OpenAI provided developers with a content moderation tool that helps them abide by OpenAI's content policy. On January 27, 2022, OpenAI announced that its newest GPT-3 language models (collectively referred to as InstructGPT) were now the default language model used on their API. According to OpenAI, InstructGPT produced content that was better aligned to user intentions by following instructions better, generating fewer made-up facts, and producing somewhat less toxic content.
Because GPT-3 can "generate news articles which human evaluators have difficulty distinguishing from articles written by humans," GPT-3 has the "potential to advance both the beneficial and harmful applications of language models." In their May 28, 2020 paper, the researchers described in detail the potential "harmful effects of GPT-3" which include "misinformation, spam, phishing, abuse of legal and governmental processes, fraudulent academic essay writing and social engineering pretexting". The authors draw attention to these dangers to call for research on risk mitigation.
GPT-3 is capable of performing zero-shot and few-shot learning (including one-shot).
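As a rough sketch of what those terms mean in prompting practice: the only difference between zero-, one-, and few-shot prompting is how many worked examples precede the query. The function and prompt format below are illustrative, not OpenAI's actual API:

```python
def build_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a completion-style prompt.

    Zero examples -> zero-shot; one -> one-shot; several -> few-shot.
    The "Input:"/"Output:" labels are an arbitrary illustrative convention.
    """
    lines = [task]
    for inp, out in examples:
        lines.append(f"Input: {inp}\nOutput: {out}")
    lines.append(f"Input: {query}\nOutput:")  # model completes from here
    return "\n\n".join(lines)

# Zero-shot: the task description alone.
zero_shot = build_prompt("Translate English to French.", [], "cheese")

# Few-shot: a couple of demonstrations before the real query.
few_shot = build_prompt(
    "Translate English to French.",
    [("sea otter", "loutre de mer"), ("peppermint", "menthe poivrée")],
    "cheese",
)
```

In the zero-shot case the model must infer the task from the instruction alone; in the few-shot case it can also pattern-match against the in-context demonstrations.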
In June 2022, Almira Osmanovic Thunström wrote that GPT-3 was the primary author of an article on itself, that they had submitted it for publication, and that it had been pre-published while waiting for completion of its review.
@@randomtexanguy9563 god damn (I'm also from Texas), that was long
@@Warcriminal843 It is a Wikipedia article about the program used to create the comment this reply thread is attached to. It is best known as ChatGPT. The original poster is (presumably) talentless and likely seeking attention with their incisive "writing skills" (they [seemingly] only have enough to write a prompt into ChatGPT), or is tasteless and believes throwing prompts into an AI and copy pasting what comes out is "humorous." You can never be sure, however, and these are just preconceptions.
ai nerd...
@@Warcriminal843 everything is bigger in Texas
This has got to be one of the greatest, most chaotic HOI4 vids I've ever seen... Love your content man
Anyone, ANYONE else would just play normally; you want to suffer, and that's why I love your content!
Fritz Fischer is actually the protagonist of a famous German tongue-twister:
Fischer Fritz fischt frische Fische. Frische Fische fischt Fischer Fritz.
Bro is him
Hollow being embodies *that* stage of suffering we've all felt while playing paradox games better than any other youtuber
small tip for 5:56: you should always spare the tactical bomber production, cuz you don't produce any, for 2 reasons: cost, and they just suck, and then it's free -consumer goods
21:41 holy shit the pronunciation was C R I S P
Ngl just discovered this channel, probably my new favourite brand of insanity
It's okay if you cheated, you literally made the challenge harder by installing the mod in the first place. Well done? I'm more concerned about your sanity at this point lol, but this video was indeed very entertaining
I… have been forgiven…
@@_HollowBeing_ we have to cover up Uganda somehow ¯\_(ツ)_/¯
we do NOT bring up the uganda incident
What Uganda incident?
Greifer jesus gta5 reference
Fellow DviperAU viewer
@@ozark7834 THIS IS MILLIONS TO ONE!
Awful but thank you
Who knew that so many Matto enjoyers also enjoyed Hollow
@@ozark7834 who is darkviperau
"Ahh Fuck it's so big!" -HollowBeing 2024
Banned.
You didn't make it sound $exual enough
Love the idea of formable A-to-Z, wish I'd thought of it :D Kickass mod idea as well, definitely wanna play this
Yes… spread the pain that is this mod
“I don’t know who the President in 2008 was” Gen alpha confirmed. Hollow is playing this on an iPad.
34:23 That was genuinely the worst pronunciation of "izquierdista" I've ever heard. Good vid bro
3:55 "Konrad AdenOVER" was RIGHT THERE.
Konrad BADenauer
This is the most chaotic and ridiculous HOI4 video I've ever seen, love it
I love how the video is in English, and when I click the link, the screenshots are in French and the text is very broken German.
Considering that the mod is called "Mod Chaos Extended", I'm not even sure if that's intended or not, but it's rather funny haha
This is good hoi4 channel, im happy to have found this
23:45
British Empire Advisor: oh no, Germany is halfway to Scotland and we're going to die! What should we do?
Mosley: Give them Austria.
Advisor: what?!
Mosley: did I stutter?
This was probably the most entertaining hoi4 video I have ever seen
just wait until he has to play all branches of monarchism in the french tree
18:33 that's the trait for the advisor named Juan Negrín for the Republicans, he was the leader of the Carabineros. (Great Carabiner, Gran Carabinero)
"oh the misery, everybody wants to be my enemy" -HollowBeing
"those two things don't exist. What the f--k is a Belgium"
***bittersteel wants to know your location***
It literally isn’t real
13:11 such a relaxing video tho
You know you can have a good time, right? You should've played normally, saving some braincells
…A good time… in hoi4??
Man, I really felt his tired voice by the end of the video. *Pain*
The British Empire in the first playthrough turning Copenhagen into Maldenhagen.
you should relax for a while man, play some wholesome playthroughs, no war, better living conditions, just deal with the economy, that ain't that hard, especially when you are playing as Sony's Guangdong!
That sounds a lot like paradox video game Victoria 3 (2021)
@@_HollowBeing_ I was talking about TNO, bruh
@@elif_coşkun_77 Have you lost your mind?
@@InfiniteDeckhand that's a funny accusation coming from a mf with an anime pfp who watches vtubers
When you're winning too much so the mod decides to put you into Downfall simulator lmao
13:15 Oh look, an extremely rare event about Charlie Chaplin.
I love how long it takes to make these. Lets me know how long the suffering went on for
This video is the definition of a HOI4 Nightmare Blunt rotation.
bro, that man with the noticeable eyebrows was hanged after an army coup
let's see how many brain cells I will lose just by watching
HollowBeing completely failing at pronouncing german words is funny
I think you had a lot of fun while recording that video
I love how my favourite HOI4 youtubers are Bittersteel, who always plays ironman mode with historical AI, and you, who always loads 10,000 saves, cheats, and does this in the weirdest world ever.
3:55 Konrad GONEaur was right there.
"Thanks for 12k"
I was hoping we'd get to 13k before this...
Hey there, comrade
12:18 I’ve been there, I can confirm there is nothing there
I'm starting to think he doesn't know how peace deals work...
Bush. The president was Bush.
It is finally here...
I watched this all the way through, and it is so depressing hearing his voice transform from an average gameplay video, to a veteran who actually fought in all of these wars.
Man finally has an easy one and decides to ruin it, what a trooper
The most historical game
One must imagine Sisyphus happy.
You're correct, no one lives in Biloxi, because that's not where the city is
when you try the chaos mod, shit in 1938 is normal.
when I try it, everyone loses their minds.
I'm addicted to this series
15:47
(As Sweden is nuked in the upper-right of the video)
HollowBeing: Gets an economic crisis
Also HollowBeing: Oh no! Anyway,
This felt like a fever dream
first event: the fall of berlin
This is sisyphus coded
Lore of HOI4'S MOST ENRAGING MOD momentum 100
I felt physical pain watching this
yo, that August von Mackensen dude was in his fucking 90s at this point
Hey mate, do you have a playlist with your background music?
History channel at 3 am:
Biloxians... rise up!
kinda hard to rise up when biloxi isn't real!
Cool mod, weird narration
This war is like 1984. A war that never ends.
I feel your pain brudha 😔
Next formable with the same mod, BUT with a 15-day deadline. Suffering like never before (I tried it as Germany, painful af)
Kaiserreich mod be like
17:54 Why are there 3 USes and why is one of them just Kentucky
American here. Biloxi didn’t actually exist, it’s all just superstition (I live in Maine so I definitely know what I’m talking about in regards to Louisiana)
Funny, because MEFO bills are amazing: -33% consumer goods
Concentrated industry is the best in single player.
Hey. I'm the guy from Discord who sleeps to your content, hi.
I don't like to say this, but there is a problem with the sound. Weird sounds come from the right side.
i LOVE YOUR channel SUCH GOOD content LOVE content
you deserve a like, a subscribe and a comment
Sanest HOI4 game I have ever seen!
if a HOI4 player goes to hell, that's what he's gonna have to do, but without cheating :skull:
Most sane ahistorical HOI4 mod.
Griefer Jesus is from GTA 5
am I crazy or is there something wrong with Schleswig-Holstein
LMAO at disband all factions, the results were funny
Bro I was stationed in Biloxi at Keesler Air Force Base. Place exists and it's a shithole
listen, I live in beloxi, I speak the native beloxian tongue, you must trust me on that
Dont mind if I do 44:20
Yo bro I'm actually concerned for ur well being, are you ok, man?
Biloxi is just casinos and seafood
39:08 LMFAO
Is there a English version of the mod?
This IS the English version. Can you not read?
@@InfiniteDeckhand Is there an English version of the mod?
I don’t think so
yeah it the normal mod
@@theman-wq5on
I sure hope the 2008 president didn't also cause issues in the middle east
(insert joke here I guess)
u dont need "PARIII" (french musiiiic) u need stalingrad and st.petersburg🤓