


THE SECURITY RISKS OF MOBILE PHONES IN 2026

Underbellymagazine Edition — No Illusions, No Comfort, Just the Truth.


If you think your phone’s just a phone, you’re already in trouble.
In 2026 it’s the most dangerous thing you own — and you carry it everywhere.
People still talk about “privacy settings” and “permissions” like they mean something. They don’t. Not anymore. Your phone isn’t a communication device. It’s a portable surveillance kit stuffed with sensors, radios, microphones, cameras, trackers, and software you’ve never heard of. And every one of them is desperate to report back to somebody. Governments. Advertisers. Data brokers. App developers. Companies you’ve never heard of in countries you’ll never visit. Everyone wants a piece of you, and your phone hands it out like free samples.
Your phone listens even when you don’t speak. It watches even when the screen is black. It tracks you even when GPS is off. It identifies you even with a burner. It remembers everything you forget. People say, “I’ve nothing to hide.” Aye, well, you’ve plenty to lose. Your movements, your habits, your relationships, your health, your voice, your face, your fingerprints, your private messages. Your entire life story, sold and analysed by systems you’ll never see. And the worst part is you carry it everywhere and treat it like a pal. It’s not your pal. It’s a tattletale in your pocket.



ACTIVE LISTENING: THE NEW THREAT

Your phone listens to you constantly, and not in the conspiracy‑theory way folk used to laugh about. In 2026, it’s simply how the technology works. Every modern phone has multiple microphones tied into systems that are always waiting, always analysing, always hungry for sound. Wake words like “Hey Siri” or “OK Google” are the children’s‑menu version. The real listening happens long before you say anything.
To detect a wake word, the phone must listen to everything first. That’s the official listening. The unofficial stuff is worse. Phones now use ambient audio analytics to detect background conversations, TV shows, music, noise levels, whether you’re alone, your emotional tone, your stress level, and your routine. They don’t need your words. They need the patterns.
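To see why “it only listens after the wake word” is backwards, here’s a toy sketch, not any vendor’s code: a wake‑word engine has to run every chunk of audio through its detector, and it typically keeps a pre‑roll buffer of earlier audio ready to hand over the moment the trigger fires. All names and the string‑matching stand‑in are invented for illustration.

```python
from collections import deque

PRE_ROLL_CHUNKS = 4          # audio kept from *before* the trigger fires

buffer = deque(maxlen=PRE_ROLL_CHUNKS)

def on_audio_chunk(chunk, wake_word="hey"):
    """Every chunk is inspected; the pre-roll is handed over on a match."""
    buffer.append(chunk)
    if wake_word in chunk:                # stand-in for acoustic matching
        return list(buffer)               # includes speech BEFORE the wake word
    return None

# Simulated stream: a private conversation, then the wake word.
stream = ["meet me at", "the bank at", "nine", "hey phone"]
captured = [on_audio_chunk(c) for c in stream]
print(captured[-1])   # ['meet me at', 'the bank at', 'nine', 'hey phone']
```

The point of the sketch: by the time the wake word is recognised, the microphone has already heard, and buffered, everything said before it.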
Apps that have no business accessing your microphone demand it anyway. Torch apps, shopping apps, games, weather widgets. They want your audio because it’s profitable. They can detect what shops you’re in, what adverts you’re exposed to, what TV shows you watch, who you’re with, and how you live. And they sell that data on.
Cross‑device audio tracking is everywhere. Your phone, your smart TV, your laptop, your smart speaker — they all listen for ultrasonic beacons hidden in adverts and apps. You can’t hear them, but your devices can. When they hear the same beacon, they know you’re in the room, what you’re watching, who you’re with, and what you’re doing.
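Those ultrasonic beacons aren’t magic. A near‑inaudible tone around 18–20 kHz is trivial to pick out of a microphone feed in software. Here’s an illustrative sketch using the Goertzel algorithm, a standard single‑frequency detector; the 19 kHz beacon frequency and the synthetic audio are made up for the example.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power of one frequency bin in a sample block (Goertzel algorithm)."""
    n = len(samples)
    k = int(0.5 + n * target_hz / sample_rate)
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

rate = 48000
t = [i / rate for i in range(4800)]                    # 0.1 s of audio
beacon = [math.sin(2 * math.pi * 19000 * ti) for ti in t]  # inaudible 19 kHz tone
speech = [math.sin(2 * math.pi * 440 * ti) for ti in t]    # audible 440 Hz tone

# The detector lights up on the beacon and stays silent on ordinary sound.
print(goertzel_power(beacon, rate, 19000) > goertzel_power(speech, rate, 19000))  # True
```

A real beacon SDK would also decode data bits from the tone, but detection itself is this cheap: no special hardware, just microphone access.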
AI makes it worse. It can identify your voice instantly, detect stress, guess your mood, recognise who else is in the room, and build a psychological profile from tone alone. Turning the microphone off doesn’t save you. Some apps ignore the setting. Some phones fake the setting. Some malware pretends the mic is off when it’s not. And some systems use other sensors to infer what’s happening even without audio.
Your phone listens because listening is profitable. It listens because it’s useful to companies and governments. And unless you take control of it, it’ll keep listening long after you stop talking.



ZERO‑CLICK ATTACKS & BASEBAND EXPLOITS

People used to think you only got hacked if you clicked something stupid. That was 2010 thinking. In 2026, you can be hacked while your phone sits in your pocket doing nothing. Zero‑click attacks require no link, no message, no app, no mistake. Your phone simply receives something — a text, a call, a push notification, a malformed image — and you’re compromised. Pegasus worked this way. Modern spyware works this way. Governments and private intelligence firms use it daily.
Baseband exploits are worse. Your phone has two operating systems: the one you see (Android or iOS) and the one you don’t — the baseband, which controls the radio. If someone hacks the baseband, they bypass your lock screen, your encryption, your apps, your antivirus, your settings, your permissions, your entire OS. It’s like breaking into your house by tunnelling under the foundations while you argue about which lock to buy.
VPNs don’t help. Antivirus doesn’t help. These attacks go underneath everything you think protects you. AI now finds vulnerabilities, crafts exploits, scans millions of devices, adapts in real time, and hides traces better than any human hacker. If someone with the right tools wants into your phone, they don’t need your help. They just need your phone to be switched on.



LOCATION TRACKING IN 2026

Turning off GPS doesn’t stop tracking. Your phone can betray your location through mobile towers, Wi‑Fi networks, Bluetooth beacons, UWB pulses, nearby devices, your movement patterns, your routine, your car, your smart meter, and your TV. You’re not hiding. You’re glowing.
5G triangulation is precise to a few feet, sometimes inches. Bluetooth beacons in shops, buses, stadiums, and airports track how long you stand in an aisle, what you walk past, how often you visit, and who you’re with. UWB can track you to the inch. Your car logs your routes and your phone’s identifiers. Your smart TV knows when your phone is in the room. Your smart meter knows when you’re home, asleep, or away.
Even if you disable everything, your behaviour gives you away. AI can identify you by your walking speed, your daily routine, the places you visit, the times you move, the people you’re near, and the shops you frequent. You’re not anonymous. You’re predictable. And predictable is easy to track.

Half the apps on your phone ask for microphone access for reasons that make no sense. A torch app. A shopping app. A game. A weather widget. Why do they need your microphone?
They don’t. They want it. Because once they have it, they can detect what shops you’re in, hear what adverts you’re exposed to, identify what TV shows you watch, match your background noise to other devices nearby, and even build a map of your social life. Then they sell that data on like it’s nothing. Marketers and brokers buy it in milliseconds, which is why you can mention a product out loud and, seconds later, open social media to find adverts for the exact thing you were talking about.


APP SURVEILLANCE & DATA BROKERS

Hackers aren’t the biggest threat to your privacy. Apps are. They don’t break in — you invite them in. And once inside, they help themselves to everything. A calculator wants your location. A shopping app wants your microphone. A game wants your contacts. A wallpaper app wants your storage. Because the app isn’t the product. You are.
Permissions are a polite fiction. Some apps ignore them. Some bypass them. Some use third‑party SDKs that hoover up data anyway. Apps track you even when you’re not using them. They wake themselves up, ping servers, scan nearby devices, check your location, read your clipboard, fingerprint your phone, and sync data to the cloud.
Data brokers then build dossiers on you: your movements, shopping habits, political leanings, relationships, health concerns, financial stress, addictions, routines, personality traits, emotional state, and likely future behaviour. They sell this to advertisers, insurance companies, private intelligence firms, political campaigns, hedge funds, foreign companies — anyone with money.
“Anonymous data” is a joke. AI can re‑identify you from your movement patterns, typing rhythm, app usage, location history, purchase habits, social graph, sleep cycle, commute, Wi‑Fi networks, and Bluetooth fingerprints. You’re not anonymous. You’re a walking barcode.



BEHAVIOURAL BIOMETRICS: THE END OF ANONYMITY

Changing your phone, SIM, number, email, or haircut doesn’t matter. Your behaviour identifies you. Your walking pattern is unique. Your typing rhythm is unique. Your touchscreen behaviour is unique. Your routine is unique. AI can identify you by your stride, your balance, your swipe speed, your scroll habits, your unlock pattern, your daily movements, your emotional state, and your stress levels.
You can’t change your behaviour enough to fool modern systems. Behavioural biometrics have killed anonymity. Your phone doesn’t just know who you are. It knows how you are.

Your phone doesn’t just hear you. It studies you.


BURNER PHONES IN 2026: WHY THEY DON’T WORK

Burner phones used to work. Not anymore. SIM swapping is useless because networks track the device, the IMEI, the radio fingerprint, and your movement patterns. Burner phones still have sensors that betray your behaviour. Location correlation links burners to your real life. Call‑pattern matching identifies you instantly. Buying a burner isn’t anonymous because shops are filled with CCTV, face recognition, Bluetooth beacons, Wi‑Fi probes, and device‑matching systems.
You can’t remove the battery anymore. Phones fake shutdown. Sensors stay alive. Baseband radios stay connected. A burner phone doesn’t hide you. It just gives surveillance systems a new device to attach your behaviour to.




“TURNING YOUR PHONE OFF” IN 2026

Turning your phone off isn’t the nuclear option anymore. Fake‑off malware exists. Sealed batteries prevent true shutdown. Low‑power sensors stay active. Baseband radios operate independently. Airplane Mode is a suggestion, not a guarantee. The only reliable option is physical separation. If you don’t want your phone listening, tracking, or analysing you, don’t bring it into the room.

Turning the mic off is like telling a burglar, “Don’t look in that drawer.”


SECURE MESSAGING APPS: SIGNAL, TELEGRAM, THREEMA

Signal is excellent for privacy but poor for anonymity because it requires a phone number. Telegram is fast and popular but insecure unless you use Secret Chats. Threema is the most private and doesn’t require a phone number, but fewer people use it. Messaging apps protect messages, not you. If your phone is compromised, encryption doesn’t matter.

THE UNDERBELLYMAGAZINE SECURITY CHECKLIST (2026)

You can’t make a modern phone safe. You can only make it less dangerous.
On Android, disable Google tracking, remove bloatware, lock down permissions, use better apps, install a firewall, and keep the device updated. On iOS, disable analytics, limit permissions, turn off unnecessary features, and use privacy‑respecting apps. On Windows, kill telemetry, use a hardened browser, avoid Microsoft accounts, disable background apps, and encrypt your drive. Linux remains the only OS that doesn’t treat you like a product. It has no forced telemetry, is open source, has a minimal attack surface, and gives you full control.



FINAL UNDERBELLYMAGAZINE WARNING

You’re living in a world where your phone listens, your apps watch, your OS reports, your network logs, your behaviour identifies you, your location betrays you, and your data is bought and sold. Most people walk around with this surveillance device glued to their hand like it’s a comfort blanket.
You can’t fully secure a modern phone. You can only limit the damage. Real privacy requires discipline. It requires distance. It requires understanding the world you’re living in, not the world you wish you were living in.
In 2026, privacy isn’t a setting. It’s a fight — one most people don’t even realise they’re losing.



Here is an in-depth look at some of the terminology used, along with the research behind the article.



“But I Use a VPN” — Aye, That’s Cute

VPNs protect your internet traffic. Zero‑click and baseband attacks don’t care about your internet traffic. They go underneath it. They go around it. They go straight into the hardware. A VPN is like wearing a raincoat during a house fire.


How AI Makes These Attacks Worse

In 2026, AI is used to:

Find new vulnerabilities. Craft perfect exploit payloads. Scan millions of devices at once. Adapt attacks in real time. Hide traces better than any human hacker.
It’s not one guy in a hoodie anymore. It’s automated, industrial‑scale exploitation.


“But I’ve Got Antivirus” — Aye, And I’ve Got a Biscuit Tin for a Helmet

Antivirus can’t see: Baseband infections. Zero‑click payloads. Kernel‑level implants. Firmware‑level spyware. Silent persistence modules. Compromised system libraries. Malicious radio‑layer code.
Antivirus is for catching yesterday’s malware. Zero‑click attacks are tomorrow’s.


If someone with the right tools wants into your phone:


They don’t need your help. They don’t need you to click anything. They don’t need you to be careless. They just need your phone to be switched on. And in 2026, that’s enough.

Location Tracking in 2026: You Can Run, But Your Phone Won’t

Folk still think turning off GPS means they can’t be tracked. Aye, that was cute in 2012.
In 2026, your phone can betray your location in more ways than you’ve had hot dinners — and half of them don’t even involve satellites.


GPS Is the Least of Your Worries

GPS is old news.

It’s the obvious tracker — the one you can switch off. But here’s the truth: Your phone doesn’t need GPS to know exactly where you are.
It can use: Mobile towers. Wi‑Fi networks. Bluetooth beacons. Ultra‑wideband (UWB) pulses. Nearby devices. Your movement patterns. Your routine. Your car. Your smart meter. Your smart TV. You’re not hiding. You’re glowing like a lighthouse.


 
5G Triangulation — Pinpoint Accuracy Without Your Permission

5G towers are everywhere — lamp posts, rooftops, street corners. They’re close together, and that means precision. Your phone constantly shouts: “Here I am!”, “Still here!”, “Moving now!” “Slowing down!”, “Stopped at the chippy!” And the network logs every bit of it.

Accuracy?

Down to a few feet. Sometimes inches. You don’t need to open an app. You don’t need to connect to Wi‑Fi. You don’t need to do anything.
Just having your phone switched on is enough.
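The maths behind tower positioning is nothing exotic. Given three towers and a distance estimate to each (derived from signal timing), basic trilateration recovers the handset’s position. A toy sketch with made‑up coordinates and clean, noise‑free distances:

```python
def trilaterate(t1, t2, t3):
    """Each argument is (x, y, measured_distance). Returns (x, y) of the phone.

    Subtracting the three circle equations pairwise leaves two linear
    equations in x and y, solved here with Cramer's rule."""
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = t1, t2, t3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Phone actually at (30, 40); towers at three street corners (metres).
pos = trilaterate((0, 0, 50.0), (100, 0, 6500**0.5), (0, 100, 4500**0.5))
print(pos)   # approximately (30.0, 40.0)
```

Real networks use noisy measurements and many more towers, which only improves the fix. The geometry stays this simple.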


Bluetooth‑LE Beacons — The Silent Grasses Everywhere You Go

Shops, buses, stadiums, airports, train stations — they’re all filled with tiny Bluetooth beacons.

You can’t see them. You can’t hear them. But your phone does.
They track: How long you stand in a shop aisle. Which products you walk past. How often you visit. Who you’re with. What route you take through a building.
And they sell that data on like it’s nothing.


Turn Bluetooth off?
Some phones still scan. Aye, that’s the part they don’t advertise.

UWB (Ultra‑Wideband) — Tracking You to the Inch

UWB is the new darling of the tech world.

It’s used for: Car keys. Smart tags. “Find My” networks. Indoor navigation. Device pairing. And it’s accurate enough to tell which side of the room you’re standing on.
If you’ve ever wondered how your phone magically knows where your lost earbuds are, here’s the answer: It’s constantly scanning the environment.
And that data doesn’t just vanish into thin air.


Your Car, TV, and Smart Home Are Snitching Too

In 2026, your phone doesn’t track you alone.

It’s got pals.
Your car. Modern cars log: Your phone’s MAC address. Your routes. Your call history. Your contacts. Your Bluetooth fingerprints. And some of them upload it to the manufacturer.
Your smart TV. It knows when your phone is in the room. It knows what you’re watching. It knows who else is there.
Your smart meter. Your electricity usage patterns can reveal: When you’re home. When you’re asleep. When you’re away. And your phone’s presence confirms it.

Everything talks to everything now. And everything reports back.

Behavioural Location Tracking — You’re a Pattern, Not a Person

Even if you:

Turn off GPS. Disable Bluetooth. Avoid Wi‑Fi. Use a VPN. Use a burner phone…
…your behaviour gives you away.
AI can identify you by: Your walking speed. Your daily routine. The places you visit. The times you move. The people you’re near. The shops you frequent. You’re not anonymous. You’re predictable. And predictable is easy to track.
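How little does “predictable” take? A toy sketch with an invented routine: count which place follows which in someone’s week, and the next location falls out of a first‑order model. Production systems are fancier, but the principle is this thin.

```python
from collections import Counter, defaultdict

def train(trace):
    """Count place-to-place transitions in a location trace."""
    transitions = defaultdict(Counter)
    for here, nxt in zip(trace, trace[1:]):
        transitions[here][nxt] += 1
    return transitions

def predict(transitions, here):
    """Most likely next place given where the person is now."""
    return transitions[here].most_common(1)[0][0]

week = ["home", "cafe", "work", "gym", "home"] * 5   # a repetitive week
model = train(week)
print(predict(model, "work"))   # gym
print(predict(model, "home"))   # cafe
```

No GPS needed: any sequence of coarse locations (tower cells, Wi‑Fi networks, shop beacons) feeds the same model.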


Your phone doesn’t just know where you are.

It knows where you’ve been. Where you’re going. Who you’re with. What you’re doing. What you’re likely to do next.
You can run. You can hide. But your phone won’t.
It’ll grass you up every time.


App Surveillance & Data Brokers: The Legal Spy Agencies

If you think hackers are the biggest threat to your privacy, you’ve not been paying attention.
The real danger isn’t some hoodie‑wearing genius in a basement — it’s the apps you installed yourself, smiling away on your home screen like they’re doing you a favour. In 2026, apps are the legal spy agencies. They don’t break in. You invite them in. And once they’re in, they help themselves to everything.


Apps Don’t Want Access — They Want Your Life Story

Every app wants permissions it has no business asking for: A calculator wanting your location A shopping app wanting your microphone. A game wanting your contacts. A wallpaper app wanting your storage. A torch app wanting your camera. Why?
Because the app itself isn’t the product. You are. Your data is worth more than the app ever will be.

Permissions Are a Polite Fiction

Folk think tapping “Allow” or “Deny” actually controls something. Aye, that’s adorable.
Here’s the truth: Some apps ignore permissions. Some apps bypass them using other sensors. Some apps use third‑party SDKs that hoover up data anyway. Some apps send data to companies you’ve never heard of. Some apps track you even when you’re not using them.
Permissions are like a “Do Not Enter” sign on a door that’s already off its hinges.


Data Brokers — The People Who Know You Better Than You Do

This is where it gets grim.

Apps don’t just collect your data. They sell it — to data brokers. Data brokers then build a dossier on you that includes:
Your movements. Your shopping habits. Your political leanings. Your relationships. Your health concerns. Your financial stress. Your addictions. Your routines. Your personality traits. Your emotional state. Your likely future behaviour. And they sell that dossier to: Advertisers. Insurance companies. Private intelligence firms. Political campaigns. Hedge funds. Foreign companies. Anyone with a credit card. You’re not a customer. You’re a product on a shelf.


“Anonymous Data” — Aye, Pull the Other One

Companies love to say your data is “anonymised.”
Here’s the truth:


There’s no such thing as anonymous data anymore.
AI can re‑identify you from: Your movement patterns. Your typing rhythm. Your app usage. Your location history. Your purchase habits. Your social graph. Your sleep cycle. Your commute. Your Wi‑Fi networks. Your Bluetooth fingerprints. You’re not anonymous. You’re a walking barcode.
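Here’s how shallow “anonymised” runs, as a sketch with invented people and cell IDs: just the place a device sits overnight and the place it sits mid‑morning are usually enough to pick one person out of a database.

```python
# Known (home cell, work cell) pairs, built from non-anonymous data.
known_people = {
    "alice": ("cell_12", "cell_88"),
    "bob":   ("cell_12", "cell_31"),
    "carol": ("cell_77", "cell_88"),
}

def reidentify(anonymous_trace):
    """Match an unnamed trace to a person via home + work cells."""
    home = anonymous_trace["03:00"]      # where the device sleeps
    work = anonymous_trace["11:00"]      # where it sits mid-morning
    matches = [who for who, hw in known_people.items() if hw == (home, work)]
    return matches[0] if len(matches) == 1 else None

trace = {"03:00": "cell_12", "11:00": "cell_88"}   # no name attached
print(reidentify(trace))   # alice
```

Published research on mobility data found a handful of coarse location points uniquely identifies the vast majority of people; the sketch above is the two‑point version of that idea.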


Apps Track You Even When You’re Not Using Them

Background activity is where the real spying happens.
Apps can:

Wake themselves up. Ping servers. Scan nearby devices. Check your location. Read your clipboard. Monitor your network. Fingerprint your phone. Sync your data to the cloud. All while you think the app is “closed.” Closing an app is like closing your eyes and pretending the room disappears.

AI Turns App Surveillance Into Behavioural Prediction

In 2026, AI doesn’t just collect your data — it predicts you.

It can guess: When you’re stressed. When you’re lonely. When you’re vulnerable. When you’re likely to spend money. When you’re likely to relapse. When you’re likely to vote. When you’re likely to argue. When you’re likely to break up. When you’re likely to get sick. And companies use that to manipulate you. Not by accident. By design.

Apps don’t need to hack you.

They don’t need to trick you. They don’t need to break the law. You hand them everything they want — because the whole system is built to make you think you have no choice. Your phone isn’t leaking data. It’s exporting it. And the folk receiving it know more about you than your pals, your partner, and probably yourself.

Behavioural Biometrics: The End of Anonymity

Folk still think anonymity means “using a different phone” or “not giving your name.” Aye, that worked back when Snake was the height of mobile gaming. In 2026, you can change your phone, your SIM, your number, your email, your haircut — doesn’t matter.
Your behaviour identifies you. And you can’t switch that off. Behavioural biometrics are the silent fingerprint you didn’t know you were leaving everywhere. And they’re far more accurate than anything you can put your thumb on. Let’s break down the ways your phone grasses you up without ever needing your name.

Gait Recognition — Your Walk Is a Barcode

Your phone’s sensors — accelerometer, gyroscope, magnetometer — constantly measure how you move.

And here’s the kicker: Your walking pattern is unique. As unique as your fingerprint. As unique as your face.
AI can identify you by: Your stride length. Your hip movement. Your arm swing. Your speed. Your balance. How your weight shifts. Even if you’re carrying a different phone, the pattern is the same. You can swap SIMs. You can swap devices. But you can’t swap your legs.


Typing Rhythm — Your Fingers Snitch Too

Every time you type, your phone records:

How fast you type. How hard you press. How long you hold each key. The gaps between letters. Your autocorrect patterns. Your swipe speed. And guess what? It’s unique. You type like you, and nobody else.
Apps use this to: Identify you. Track you. Predict your mood. Detect stress. Build a behavioural profile. Even if you’re using a “private” app, your typing rhythm gives you away.
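As a rough illustration, with invented timings and a deliberately crude distance measure, matching a fresh typing sample against stored rhythm profiles can look as simple as this:

```python
def rhythm_distance(a, b):
    """Mean absolute difference between two flight-time vectors (ms)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Stored profiles: gaps between key presses, in milliseconds (made up).
profiles = {
    "owner":    [110, 95, 140, 90, 120],
    "stranger": [220, 180, 300, 250, 210],
}
sample = [105, 100, 135, 95, 118]    # fresh typing sample, same length

best = min(profiles, key=lambda who: rhythm_distance(profiles[who], sample))
print(best)   # owner
```

Real systems use far richer features (dwell time, pressure, error patterns) and statistical models, but the matching step is the same shape: a distance between your sample and everyone’s profile.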


Touchscreen Behaviour — The Way You Tap Is a Signature

Your phone knows:

How fast you scroll. How you flick. How you pinch. How you zoom. How you hold the device. How often you check it. How you unlock it. How you swipe notifications.
This is behavioural gold. Companies use it to: Identify you across apps. Detect fraud. Personalise ads. Build psychological profiles.


Governments use it to:

Link burner phones. Identify anonymous users. Track dissidents. Match devices to individuals. Your touchscreen is a tattletale.

Movement Patterns — Your Routine Is a Dead Giveaway

Even if you:

Turn off GPS. Disable Bluetooth. Avoid Wi‑Fi. Use a VPN. Use a burner phone…your routine betrays you.

AI can identify you by:

The time you wake up. The time you leave the house. Your commute. Your lunch habits. Your gym schedule. Your weekend patterns. Your social circles. Your favourite shops. Your travel routes. You’re not anonymous. You’re predictable. And predictable is easy to track.



Emotional Biometrics — Your Phone Knows Your Mood

Your phone can detect:

Stress. Anger. Excitement. Boredom. Anxiety. Fatigue. How?
Through: Voice tone. Breathing patterns. Micro‑movements. App usage. Screen interaction speed. And companies use this to manipulate you. Not by accident. By design.
You can change your phone. You can change your SIM. You can change your number. You can change your apps. But you can’t change you. Your behaviour is the ultimate identifier — and in 2026, it’s the one thing you can’t hide.
Your phone doesn’t just know who you are. It knows how you are. And that’s enough to track you anywhere.


Burner Phones in 2026: Why They Don’t Work Anymore

Back in the day, folk thought a burner phone made them invisible. Buy it with cash, swap the SIM, ditch it after a week — job done.
Aye, that worked when phones were dumb and networks were dumber. But in 2026, a burner phone is about as anonymous as wearing a balaclava with your name stitched on the front.


SIM Swapping Is Useless — The Network Tracks the Device Too

Folk still think swapping SIM cards hides them. Nope.

The network logs: The SIM. The device. The IMEI. The radio fingerprint. The tower pattern. The movement behaviour. Swap the SIM all you want — the network still knows it’s you. It’s like changing your jacket and thinking your pals won’t recognise you.

Burner Phones Still Have Sensors — And They Still Grass You Up

A burner phone still has:


Accelerometers. Gyroscopes. Magnetometers. Bluetooth. Wi‑Fi. UWB. Microphones. Baseband radios. And every one of those sensors leaks behavioural data. You can’t hide your walk. You can’t hide your typing rhythm. You can’t hide your habits. A burner phone doesn’t erase your identity — it just gives you a new device to betray it with.

Location Correlation — The Silent Killer of Anonymity

Even if you:

Buy the phone with cash. Activate it in a random place. Never call your usual contacts. Never log into anything. Never install apps…your location overlaps with your real life.

If your burner:

Sleeps where you sleep. Travels where you travel. Sits in your pocket at work. Moves in sync with your real phone. Visits your usual shops…
Then congratulations — you’ve just linked the burner to yourself. AI doesn’t need your name. It needs your routine. And your routine is loud.
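A sketch of the linking idea, with invented traces: score pairs of devices by how often they sit in the same cell at the same time. Two devices that are always together belong to one person.

```python
def colocation(trace_a, trace_b):
    """Fraction of time slots where two devices share a location cell."""
    same = sum(1 for a, b in zip(trace_a, trace_b) if a == b)
    return same / len(trace_a)

# Hourly location cells for three devices (made up).
real   = ["home", "home", "bus", "work", "work", "pub", "home", "home"]
burner = ["home", "home", "bus", "work", "work", "pub", "home", "home"]
other  = ["flat", "gym",  "gym", "shop", "flat", "pub", "flat", "flat"]

print(colocation(real, burner))   # 1.0  -> almost certainly the same person
print(colocation(real, other))    # 0.125
```

Run that over every device pair in a city and the burners fall out of the data by themselves.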


Call‑Pattern Matching — PROTON on Steroids

In 2013, PROTON could match a new number to a person based on call patterns.


In 2026, that tech is:

Instant. Automated. AI‑driven. Commercially available. Used by governments and private firms

If you call:

Your mum. Your work. Your pals. Your partner. Your dentist…even once, the system knows it’s you. And even if you don’t call them — if you call people who know the same people, you’re still caught. Your social graph is a fingerprint.
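The core of call‑pattern matching is embarrassingly simple. A sketch with made‑up names, using plain Jaccard overlap between the set of numbers a known person calls and the set a new “anonymous” number calls:

```python
def jaccard(a, b):
    """Overlap between two sets: 1.0 identical, 0.0 disjoint."""
    return len(a & b) / len(a | b)

# Historical contact sets for known identities (invented).
known_callsets = {
    "jimmy": {"mum", "work", "dentist", "partner"},
    "senga": {"gym", "school", "salon", "sister"},
}
new_number_calls = {"mum", "work", "partner"}   # the burner's first few calls

match = max(known_callsets,
            key=lambda who: jaccard(known_callsets[who], new_number_calls))
print(match)   # jimmy
```

Systems in this space add timing, call duration, and second‑degree contacts, but the burner is already attributed on contact overlap alone.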

Buying a Burner Isn’t Anonymous Either

Shops are crawling with:

CCTV. Face recognition. Bluetooth beacons. Wi‑Fi probes. Loyalty card tracking. Payment correlation. Device‑matching systems. Even if you pay cash, the shop knows:
When you entered. What you touched. What you bought. What device you had on you. Who else was in the shop. What cameras saw your face. You’re not anonymous. You’re just optimistic.


“Turning Your Phone Off” in 2026: The Hard Truth

Folk still think turning their phone off is the nuclear option — the big red button that magically makes them invisible.
Aye, that was the dream.
But in 2026, “off” doesn’t always mean off, and your phone can still grass you up long after the screen goes black.

Fake‑Off Malware — Your Phone Pretends to Sleep While It Spies

This isn’t sci‑fi. This isn’t paranoia. This is documented, real, and used in the wild.

Malware exists that: Shuts the screen off. Kills the UI. Stops notifications. Pretends the phone is dead…but keeps: The microphone active. The baseband connected. Sensors running. Logs being written. Data queued for upload. You think the phone’s off. The phone knows it’s not.
It’s like telling a toddler to go to bed and hearing them sprint around the room the second the door closes.


Sealed Batteries — You Can’t Pull the Plug Anymore

Back in the day, you could yank the battery out and know for sure the phone was dead.
In 2026?
Sealed batteries. Glued backs. Proprietary screws. Tamper‑detect systems. “Battery health” chips. Power‑reserve modes. Even when “off,” many phones keep a tiny reserve of power for: Find My Device. Emergency beacons. Network pings. Background diagnostics. You can’t remove the battery. You can’t kill the power. You can’t trust the off switch.


Sensors Don’t Need Power to Betray You

Here’s the part nobody expects: Some sensors can still leak information even when the phone is “off” because they’re tied to low‑power subsystems.
These include: Accelerometer. Gyroscope. Magnetometer. NFC. UWB. Bluetooth LE (in some models).
These can reveal: Movement. Proximity. Direction. Nearby devices. Whether the phone is being handled. Your phone doesn’t need to be “on” to know what you’re doing.


Baseband Radios Can Stay Alive Independently

The baseband — the radio that talks to the mobile network — is its own little computer.
And it can: Stay powered. Stay connected. Receive silent SMS. Receive paging signals. Respond to tower queries…even when the main OS is “off.” If the baseband is alive, the phone is alive.
End of story.


“Airplane Mode” — Aye, That’s a Suggestion, Not a Guarantee

Airplane Mode is a software switch.
Software can lie. Some malware: Fakes Airplane Mode. Fakes Wi‑Fi off. Fakes Bluetooth off. Fakes mobile data off. You see a wee plane icon. The phone sees a green light to keep talking behind your back.


The Only Reliable Option: Physical Separation

Here’s the Glasgow truth:
If you don’t want your phone listening, tracking, or analysing you — don’t bring it into the room. Not “turn it off.” Not “Airplane Mode.” Not “wrapped in a sock.” Not “face down on the table.” Leave it: in another room, in a metal box, in a drawer, in the car, anywhere except beside you. Distance is the only real privacy left. Your phone can’t spy on what it can’t hear.


Secure Messaging Apps in 2026: Signal, Telegram, Threema (Pros & Cons)

Folk love to argue about messaging apps like they’re football teams.
“Aye, Signal’s the best!”
“Naw, Telegram’s pure magic!”
“Threema’s Swiss, so it must be safe!”
Calm down. Every app has strengths. Every app has weaknesses.
And none of them turn your phone into a fortress — because the phone itself is the weakest link. But if you’re going to use encrypted messaging, you might as well know what you’re actually dealing with. Let’s break them down properly.


SIGNAL — The Gold Standard (But Not a Magic Shield)

Signal is the one the security folk swear by, and for good reason. But it’s not perfect, and it’s not anonymous.
Pros
End‑to‑end encryption by default. No faffing about with settings. Everything’s encrypted. Open source. Anyone can inspect the code. No hidden nonsense. Trusted by security researchers. If it was dodgy, someone would’ve shouted by now. No ads, no trackers. They’re not trying to sell your soul. Disappearing messages. Handy when you don’t want a chat history following you around.

Cons

Requires a phone number. That alone kills anonymity. Your number is a permanent identifier. Metadata still exists. Who you talk to, when, how often — that’s still visible to the network. Your phone can still be compromised. If your device is hacked, the encryption doesn’t matter. Backups can leak if you’re careless. Cloud backups are a liability.

Glasgow Verdict: Brilliant for privacy. Useless for anonymity. If you want secure chats with folk you trust, Signal’s your pal. If you’re trying to disappear, it’s not enough.


TELEGRAM — Popular, Fast, and Full of Holes

Telegram is the app folk think is secure because it looks fancy and has big channels.
Aye, it’s fast. Aye, it’s convenient. But secure? Only if you use it properly — and most folk don’t.

Pros

Huge user base. Everyone and their granny is on it. Fast, smooth, feature‑packed. It works well. No denying that. Secret Chats are end‑to‑end encrypted. But only if you turn them on. Doesn’t require your real name. You can use a username instead.

Cons

Normal chats are NOT end‑to‑end encrypted. Telegram can read them. Their servers can read them. Anyone who hacks them can read them. Cloud chats are stored on Telegram’s servers. That’s a massive attack surface. Closed‑source server code. You’re trusting them blindly. Metadata galore. Who you talk to, when, how often — all logged.
Glasgow Verdict: Great messenger. Terrible for privacy unless you use Secret Chats. If you’re using Telegram for anything sensitive in normal chat mode, you’re basically shouting out the window.


THREEMA — The Quiet Swiss Tank

Threema is the one folk don’t talk about much — which is ironic, because it’s one of the most privacy‑respecting apps out there.

Pros
No phone number required. Massive win for anonymity. Swiss privacy laws. Better than most countries. End‑to‑end encryption everywhere. No exceptions. Open‑source clients. Transparent and trustworthy. Minimal metadata. They collect as little as possible.

Cons
Paid app. A couple of quid, but still a barrier. Smaller user base. Harder to convince your pals to join. Closed‑source server code. Not ideal, but still better than Telegram’s cloud model. Not as feature‑rich. It’s built for privacy, not bells and whistles.

Glasgow Verdict:
The most private of the three — if you can get folk to use it. If anonymity matters, Threema beats Signal and Telegram hands‑down.



The Hard Truth

No messaging app can save you if: your phone is compromised. Your OS is leaking data. Your backups are insecure. Your behaviour gives you away. Your contacts are sloppy. Your device is tied to your identity. Encryption protects the message.
It does not protect: your metadata. Your identity. Your behaviour. Your device. Your location. Your social graph. Apps are tools. Your phone is the risk.
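The metadata point deserves a demonstration. Below is a toy sketch in Python (the names and records are invented for illustration): given nothing but who contacted whom and at what hour, with no message content at all, a watcher can already pull out your closest contact and your late-night pattern.

```python
from collections import Counter

# Call/message metadata only: (from, to, hour_of_day). No content needed.
records = [
    ("alice", "bob", 22), ("alice", "bob", 23), ("alice", "bob", 22),
    ("alice", "dealer", 2), ("alice", "work", 9), ("alice", "work", 9),
]

# Who matters most to "alice"? Just count who she contacts.
contacts = Counter(to for _, to, _ in records)
print(contacts.most_common(1))   # closest contact found, no message ever read

# When is she active? Late-night traffic stands out on its own.
late_night = [to for _, to, hour in records if hour >= 22 or hour <= 4]
print(late_night)
```

Six records and two lines of analysis. Scale that to a year of traffic and the social graph writes itself, encryption or no encryption.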



 The Security Checklist (2026)

This is the part folk always jump to — the checklist. But here’s the truth before we start: You can’t make a modern phone “safe.” You can only make it less dangerous. Phones are built for surveillance. Operating systems are built for data extraction. Apps are built for profiling. Networks are built for tracking. But you can harden things. You can reduce the damage. You can stop handing your life away on a silver platter.
Here’s the Underbelly‑approved, no‑nonsense checklist for 2026.


ANDROID SECURITY CHECKLIST (2026)

Android is the wild west — powerful, flexible, but noisy as hell. If you’re on Android, you need to be twice as sharp.
1. Disable the Grassers
Turn off: Google Location History. Google Web & App Activity. Google Ad Personalisation. Nearby Share. Bluetooth scanning. Wi‑Fi scanning. "Improve accuracy" settings (they improve tracking, not accuracy).

2. Kill the Bloat
Uninstall or disable:
Facebook. Instagram. TikTok. Snapchat. Chrome. Manufacturer bloatware. Anything preinstalled that you don’t use. If you don’t recognise it, bin it.

3. Use Better Apps
Replace:
Chrome → Firefox or Brave. Gmail → Proton Mail. Google Messages → Signal. Google Maps → Organic Maps or OsmAnd.

4. Lock Down Permissions
For every app:
No location unless absolutely needed. No microphone unless absolutely needed. No camera unless absolutely needed. No contacts, ever. No background activity unless essential. If an app demands too much, delete it.

5. Harden the Device
Use a strong passcode (not a pattern). Disable biometrics if you’re worried about coercion. Turn off “Nearby Devices” permissions. Disable “Install unknown apps”. Turn off personalised ads.

6. Use a Firewall
Install NetGuard or RethinkDNS. Block everything that doesn’t need the internet.

7. Keep It Updated
Old Android = open door.


iOS SECURITY CHECKLIST (2026)

Apple is better at privacy, but don’t kid yourself — it still collects plenty.

1. Kill the Tracking
Turn off: Significant Locations. Analytics & Improvements. Personalised Ads. iCloud Keychain if you don't trust syncing. Siri Suggestions. Background App Refresh (for most apps).

2. Lock Down Permissions
For every app:
Location → “While Using” or “Never”. Microphone → Off unless essential. Camera → Off unless essential. Contacts → Never. Bluetooth → Off unless needed. Photos → “Selected Photos” only

3. Use Better Apps
Replace:
Safari → Firefox Focus or Brave. Mail → Proton Mail. iMessage → Signal for sensitive chats.

4. Disable the Snitches
Turn off AirDrop. Turn off Find My Network. Turn off “Share Analytics with App Developers”

5. Hardening
Use a long passcode. Disable Face ID if you’re worried about forced unlocks. Turn off Lock Screen widgets (they leak info)



WINDOWS SECURITY CHECKLIST (2026)

Windows is convenient — and noisy. It phones home more than a homesick student.

1. Kill Telemetry
Turn off: Diagnostic data. Tailored experiences. Advertising ID. Location services. Activity history. Sync settings.

2. Harden the Browser

Use:
Firefox + uBlock Origin. Brave. DuckDuckGo search
Avoid:
Edge (too much tracking). Chrome (the Google vacuum cleaner).

3. Local Accounts Only

Do not use a Microsoft account unless you absolutely must.

4. Disable Background Apps

Most of them are pointless and noisy.

5. Encrypt the Drive
Turn on BitLocker (Pro) or VeraCrypt (Home).

6. Keep Software Minimal
The fewer apps, the fewer leaks.



WHY LINUX IS STILL THE BEST FOR PRIVACY (2026)


Linux isn’t magic — but it’s the only OS that isn’t built to spy on you.

1. No Forced Telemetry. Linux doesn't phone home. It doesn't care who you are. It doesn't want your data.

2. Open Source. Anyone can inspect the code. No hidden tracking. No secret analytics.

3. Minimal Attack Surface. No bloat. No adware. No preinstalled nonsense.

4. You Control Everything. Permissions. Updates. Networking. Services.
Everything is yours to configure.

5. Best Distros for Normal Folk. Linux Mint (easy, stable). Ubuntu (popular, supported). Fedora (modern, secure defaults). Tails (for high‑risk, amnesic sessions)
Underbelly Verdict:
If you want real privacy, Linux is the only OS that doesn't treat you like a product. You can't make your phone safe.
You can only make it less dangerous. But you can make your computer safe — and Linux is the closest thing to digital freedom left.

Threema seems to be the most secure messaging app on the market, but after reading what follows, should phones or digital communication be used at all? Look at the best three secure messaging apps: Threema, Telegram, Signal. (links). I know which I use. However, always choose your own. Don't be swayed to use any messaging app recommended to you. Research.

EncroChat: Cracked or Plant?

 

In 2020, French authorities successfully infiltrated the EncroChat encrypted phone network, which was widely used by criminal organizations for secure communication.

 

Here's how they did it: EncroChat was one of the largest providers of encrypted communication services, with an estimated 60,000 users across Europe. The service was popular among criminal networks because it offered highly secure messaging, with features like the removal of microphones, cameras, and GPS from devices to prevent tracking and interception.

EncroChat used end-to-end encryption to secure its users' communications. This encryption method ensures that only the communicating users can read the messages, making it extremely difficult for anyone else, including the service provider, to intercept and decipher the content.

Understanding End-to-End Encryption: End-to-end encryption (E2EE) is a method of data transmission that ensures only the communicating users can read the messages. It provides a high level of security by encrypting the data at the sender's device and only decrypting it at the recipient's device.

Here's a deeper look at how it works: When you send a message, your device encrypts it before it leaves your device. This encryption uses a unique key that only the intended recipient has the decryption key for. The encrypted message is transmitted over the internet. While in transit, it remains encrypted and unreadable to anyone who might intercept it, including service providers and potential hackers. When the encrypted message reaches the recipient's device, it is decrypted using the unique key stored on that device. Only the recipient's device can decrypt the message, making the data readable again.

Each user has a pair of cryptographic keys – a public key (known to everyone) and a private key (kept secret). The sender encrypts the message using the recipient's public key, and the recipient decrypts it using their private key.

Even if someone intercepts the message during transmission, they cannot read it without the private key. This ensures the privacy and security of the communication.
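The public/private key dance described above can be made concrete with textbook RSA on deliberately tiny numbers. This is a sketch for intuition only: the primes here are toy-sized and offer zero real security, and production systems add padding schemes on top of the raw maths.

```python
# Textbook RSA with toy primes (totally insecure -- illustration only).
p, q = 61, 53
n = p * q                  # modulus, part of the public key
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent (known to everyone)
d = pow(e, -1, phi)        # private exponent (kept secret); 17*d == 1 mod phi

message = 65                         # a message encoded as a number < n
ciphertext = pow(message, e, n)      # sender encrypts with the PUBLIC key
recovered = pow(ciphertext, d, n)    # recipient decrypts with the PRIVATE key

print(ciphertext, recovered)
assert recovered == message   # only the private-key holder gets this back
```

An interceptor sees `ciphertext` but, without `d`, recovering `message` means factoring `n`. Trivial for 3233; computationally infeasible for the 2048-bit-plus moduli real systems use.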

End-to-end encryption is widely used in messaging apps (like WhatsApp and Signal), email services, and other secure communication platforms. This openly indicates that the claim these applications are safe and keep your data protected is only true to a degree. If EncroChat was hacked by the French authorities, then no end-to-end encryption is safe. We at Underbelly know the level of security E2EE provides and how difficult it is to burgle.

E2EE should ensure that only the intended recipient can read the messages, protecting against eavesdropping and unauthorized access. It should also protect data integrity during transmission, ensuring messages cannot be tampered with or altered, even by top pen testers. However, users must securely manage their encryption keys. Losing a private key means losing access to the encrypted data.

E2EE is complex: implementing it properly requires robust technical infrastructure and expertise.

End-to-end encryption developers claim it provides a robust method for securing communications, ensuring that data remains private and secure from the moment it leaves the sender's device until it is decrypted by the recipient. While it presents some challenges, its benefits in protecting privacy and data integrity make it a crucial tool in modern digital communications.

End-to-end encryption (E2EE) is designed to be highly secure, making it extremely difficult to hack. Here are some key points to consider. E2EE uses robust encryption algorithms like AES (Advanced Encryption Standard) and RSA (Rivest–Shamir–Adleman), which are widely regarded as the most secure and have withstood extensive cryptographic analysis and penetration testing. Each communication session uses unique encryption keys, known only to the sender and recipient. This means that even if one session is compromised, other sessions remain secure. Successfully breaking E2EE without access to the private keys is computationally infeasible with current technology. It would require an enormous amount of time and resources, making it impractical for most attackers.
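The per-session-key idea can be sketched with the standard library alone (the function and variable names are hypothetical): each session's key mixes a long-lived secret with a fresh random nonce through HMAC, so a leaked session key exposes neither the master secret nor any other session.

```python
import hashlib
import hmac
import secrets

def session_key(master_secret: bytes, session_nonce: bytes) -> bytes:
    """Derive a per-session key. HMAC is one-way, so recovering the
    master secret from a single leaked session key is infeasible."""
    return hmac.new(master_secret, session_nonce, hashlib.sha256).digest()

master = b"long-lived shared secret"
k1 = session_key(master, secrets.token_bytes(16))  # fresh nonce, session 1
k2 = session_key(master, secrets.token_bytes(16))  # fresh nonce, session 2

print(k1 != k2)  # distinct keys for distinct sessions
```

Real protocols (Signal's double ratchet, for one) go much further, rotating keys per message, but the compartmentalisation principle is the same.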

The security of E2EE also depends on the security of the devices at both ends of the communication. If a device is compromised (e.g., through malware), the encryption can be bypassed.

While no system is entirely hack-proof, end-to-end encryption provides a very high level of security, making it one of the best methods available for protecting sensitive communications. However, it's essential to ensure proper implementation and maintain good security practices on the devices involved.

 

The Investigation

EncroChat was not the first investigation of this kind. The FBI pulled off one of the boldest tech stings in recent history with a covert operation called Trojan Shield, centred on a fake encrypted phone company named Anom:

Operation Trojan Shield ran from October 2018 to June 2021:

 

  • October 2018: The first ANOM devices were distributed to criminal networks for beta testing.

  • Mid-2019 to 2021: The platform gained traction globally, with thousands of devices in use.

  • 8 June 2021: Law enforcement agencies executed coordinated raids worldwide, marking the public reveal and conclusion of the operation.

It was a multi-year sting that quietly infiltrated criminal communications before culminating in one of the largest law enforcement actions ever.

 

Here's how it worked:

  • The Setup: After shutting down other encrypted phone services used by criminals (like Phantom Secure), the FBI partnered with a confidential source to develop Anom—a phone that looked like a secure device but secretly copied every message sent through it.

  • The Tech: These phones were modified versions of Google Pixel and Samsung devices, running a custom OS called Arcane OS. They had hidden back doors that allowed law enforcement to monitor communications in real time.

  • The Distribution: The FBI used trusted criminal distributors to spread the phones, making them seem like the next big thing in secure communication. Access was invite-only, which added to their allure.

  • The Trap: Criminals used Anom to coordinate drug deals, money laundering, and other illicit activities—thinking they were safe. In reality, the FBI was reading everything. Over 27 million messages were intercepted from 11,800 devices across 90+ countries, leading to 800+ arrests.

  • The operation was so successful it’s been called the biggest sting in Australian history, and it’s now the subject of books and documentaries. It also sparked debates about privacy, surveillance ethics, and the future of law enforcement tech.

Then there was the Sky ECC global bust. This was one of the most dramatic take-downs of an encrypted communication network used by organized crime. 

Here's how it unfolded:

What Was Sky ECC?

A subscription-based encrypted messaging service developed by Sky Global, founded in Vancouver in 2008. Marketed as "unhackable," it used elliptic-curve cryptography (ECC) and featured self-destructing messages, kill switches, and disabled GPS/microphones.

Devices were modified smartphones (BlackBerry, Nokia, Apple, Android) sold for criminal use.

The Crackdown:

March 2021: Europol and law enforcement agencies from Belgium, the Netherlands, and France cracked Sky ECC’s encryption and began monitoring 70,000 users in real time.

Raids: On 9 March 2021, Belgian police conducted 200 raids, arrested 48 people, and seized €1.2 million and 17 tonnes of cocaine.

The network was so trusted that criminals openly shared execution orders, torture images, and insider financial data.

Aftermath & Arrest:

Over 100 convictions in Belgium’s largest drug trial by October 2024, with sentences up to 17 years and millions in assets seized.

January 2025: Spanish and Dutch authorities arrested four top distributors who managed 25% of Sky ECC’s subscriptions, earning over €13.5 million.

Seizures included cryptocurrency, luxury items, vehicles, and mobile devices.

Legal Fallout:

Sky Global’s CEO, Jean-François Eap, was indicted by the U.S. DOJ for RICO violations, accused of knowingly aiding criminal enterprises.

The company denied wrongdoing, claiming it was targeted for defending privacy rights.

This operation exposed the dark side of encrypted tech and sparked global debates on privacy vs. surveillance.

 

 

Then came the EncroChat sting. Part of Operation Venetic, it was undeniably one of the most successful law enforcement operations against organized crime in Europe. But with its scale and secrecy came a swirl of conspiracy theories and legal challenges. Let's unpack what's real and what's speculative.

Common Conspiracies & Claims

1. Illegal Hacking Allegations

  • Claim: French authorities hacked EncroChat servers unlawfully.

  • Truth: The French government has maintained that the operation was legal under its jurisdiction. UK courts have upheld the admissibility of the evidence, though some defense lawyers argue the surveillance violated privacy rights.

2. EncroChat Was a Government Honeytrap

  • Claim: EncroChat was secretly created or co-opted by law enforcement.

  • Truth: There's no credible evidence that EncroChat was a government front. It was a Dutch-based company offering encrypted phones, and its downfall came from a technical tool installed by French investigators.

3. Evidence Was Fabricated

  • Claim: Messages were altered or wrongly linked to suspects.

  • Truth: Courts have seen cases where suspects were identified through metadata, selfies, and even GPS overlaps with personal phones. While attribution challenges exist, convictions have held up under scrutiny.

4. EncroChat Users Were Targeted Without Due Process

  • Claim: Arrests were made without proper warrants or legal basis.

  • Truth: Law enforcement agencies across Europe coordinated through Europol and followed national procedures. In the UK, over 3,000 arrests and 1,000 convictions have been secured, with courts consistently ruling the evidence admissible.

5. EncroChat Wasn't That Effective

  • Claim: Despite the bust, drug trade and crime remain unaffected.

  • Truth: While the sting disrupted major networks, reports suggest Europe's drug trade rebounded quickly. New players filled the void, and encrypted platforms evolve.

Legal Fallout & Ongoing Debate

  • Defense lawyers continue to challenge the legality of the hack, especially around privacy rights and cross-border surveillance.

  • Some cases have raised concerns about fair trial rights, especially when defendants couldn't access full technical details of the hack.

So yes, there are real legal debates, but most of the wilder conspiracies haven't held up.

Operation Venetic officers first claimed that a device was attached to the EncroChat server, allowing Europol and the French authorities access to it for ten weeks.

The French Gendarmerie began investigating EncroChat in 2017 after discovering that the phones were frequently found in operations against organized crime groups. They suspected that the company was operating from servers in France.

To infiltrate EncroChat, French authorities managed to bypass EncroChat’s encryption by secretly installing a technical tool onto the platform’s servers. This tool allowed them to intercept and decrypt messages in real time before they could be fully encrypted and transmitted between users.

The hack had a significant impact on organized crime across Europe. In the UK alone, it led to the arrest of approximately 746 suspected high-ranking criminals involved in activities such as murder, gun smuggling, and drug trafficking. Authorities seized large amounts of cash and drugs as a result of the operation.

The successful infiltration of EncroChat by French authorities demonstrated the vulnerabilities even in highly secure communication networks. It also highlighted the importance of international cooperation in tackling organized crime and the effectiveness of advanced technical measures in law enforcement operations.

French authorities had access to EncroChat from April 1, 2020, when they first installed the interception tool, until June 13, 2020, when EncroChat announced it was shutting down due to the police operation. This means they had access for about two and a half months.

French authorities managed to infiltrate EncroChat by installing a technical tool on the company's servers in France. This operation, authorized by a judge in Lille at the end of January 2020, allowed them to gain access to millions of encrypted messages sent by users.

The authorities collaborated with the National Crime Agency (NCA) and Dutch police to leverage intelligence and technical expertise for this infiltration. This enabled them to monitor and investigate over 100 million messages in real-time, leading to numerous arrests and seizures across Europe.

It was a significant and complex operation that showcased the lengths law enforcement agencies will go to in order to combat organized crime.

The device installed by French authorities on EncroChat's servers allowed them to intercept and read messages in real-time without necessarily needing to crack the end-to-end encryption directly. Instead of breaking the encryption, they were able to access the messages before they were encrypted or after they were decrypted on the users' devices. This method enabled them to monitor communications without alerting the users or compromising the encryption keys themselves.
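Intercepting "before encryption or after decryption" is easier to picture with a toy sketch (all names and the XOR "cipher" below are invented for illustration). The implant never touches the cryptography: it simply wraps the app's send path and copies the plaintext first, which is exactly why endpoint compromise defeats even perfect encryption.

```python
import hashlib

def encrypt(plaintext: str, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- stands in for the real encryption step."""
    stream = hashlib.sha256(key).digest() * (len(plaintext) // 32 + 1)
    return bytes(b ^ s for b, s in zip(plaintext.encode(), stream))

def secure_send(plaintext: str, key: bytes) -> bytes:
    """The app's normal path: encrypt, then hand off to the network."""
    return encrypt(plaintext, key)

# An endpoint implant doesn't break the cipher. It wraps the send
# function and copies the plaintext BEFORE encryption ever happens.
captured = []

def implanted_send(plaintext: str, key: bytes) -> bytes:
    captured.append(plaintext)          # plaintext siphoned off here
    return secure_send(plaintext, key)  # message still encrypts normally

ciphertext = implanted_send("meet at the lockup at 9", b"session-key")
print(captured)  # the watcher has the message in the clear
```

The message on the wire is still ciphertext; nothing looks wrong to the recipient or the network. That's the whole point of attacking the endpoint instead of the encryption.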

 

The second explanation: the hacking operation conducted by French authorities involved accessing messages on users' devices before they were encrypted or after they were decrypted. They achieved this by deploying malware onto the EncroChat devices themselves. This malware allowed them to capture the plain text of the messages directly from the users' devices.

Here's how they now claimed it worked. Malware deployment: French authorities deployed remote malware onto EncroChat devices, allowing them to covertly intercept messages before they were encrypted. This malware worked behind the scenes, secretly collecting real-time conversations, user data, and even login credentials.
The operation was incredibly sophisticated: the malware was installed remotely and designed to operate covertly on the users' phones.

The malware intercepted messages at the endpoints—either before they were encrypted for sending or after they were decrypted upon receipt. This way, the authorities could read the messages in plain text without needing to break the encryption.

The hacking operation, in this telling, deployed malware that targeted EncroChat devices rather than attacking the servers themselves directly. Here's a breakdown of how it worked:

The French authorities developed malware specifically designed for EncroChat handsets, and it was covertly installed on users' devices.

Once installed, the malware captured plain-text messages, "either" before they were encrypted for sending or after they were decrypted upon receipt. This effectively bypassed the end-to-end encryption without needing to break the encryption itself.

The malware infiltrated a large number of devices across the EncroChat network, enabling the authorities to monitor millions of messages in real time.

While the servers facilitated the deployment by hosting updates and other services, the primary attack vector was the users' devices; the servers played a supporting role in distributing the malware.

This strategy allowed law enforcement to bypass the encryption process entirely, access communications directly from the users' devices, gather valuable intelligence, and disrupt criminal activities coordinated through the EncroChat network.

Now they claim: The term "device" in this context can be a bit misleading. The French authorities did not necessarily attach a physical piece of hardware to the EncroChat servers. Instead, they deployed a software-based tool or malware onto the servers.

Here's a more precise breakdown: The "device" refers to a software tool or malware that was installed on the EncroChat servers and subsequently distributed to users' devices. This tool allowed authorities to intercept communications by exploiting the servers and the devices connected to them.

This malware was deployed remotely and did not require physical access to the servers. It was designed to operate covertly on the users' phones, allowing authorities to capture messages in plain text before encryption or after decryption.

So, while no physical hardware was attached to the EncroChat servers, the software tool/malware effectively achieved the same goal of intercepting communications.

Why change tech terms? Any IT tech will tell you hardware and software are two different things. Hardware is an actual device; software runs the device. Hardware is useless without software, drivers and the like. Software is dangerous on its own, as it can be installed on hardware after it's built, or manipulated to serve a purpose. But software is more easily traceable than a "device" attached and hidden in the server, is it not?

 

Operation Venetic was a major international law enforcement operation targeting the encrypted communication platform EncroChat, which was widely used by organized crime groups. Here's a detailed overview:

EncroChat was a Europe-based company that provided modified smartphones with encrypted messaging services. It became popular among criminal organizations for coordinating illegal activities.

 

In early 2020, French and Dutch authorities, with support from Europol and Eurojust, launched Operation Venetic to infiltrate EncroChat. They installed a technical tool on EncroChat's servers, allowing them to intercept and monitor communications in real-time.

Authorities collected vast amounts of data, including millions of messages and hundreds of thousands of images. This data was analysed to identify and locate offenders.

The operation led to over 6,500 arrests and the seizure of nearly €900 million in cash, along with significant quantities of drugs, firearms, and other illegal items. It also prevented numerous threats to life by mitigating over 200 planned kidnappings and executions.

On June 13, 2020, EncroChat realized their platform had been compromised and urged users to discard their devices. The company announced it would cease operations due to the police operation.

Operation Venetic is considered one of the largest and most significant law enforcement operations against organized crime in Europe. It demonstrated the effectiveness of international cooperation in tackling encrypted criminal communication networks.

EncroChat was a Dutch-based company that specialized in selling encrypted mobile phones designed for secure communication. These devices were modified Android smartphones with features like GPS, microphones, and cameras removed, and they came pre-installed with encrypted messaging applications.


EncroChat marketed itself as a legitimate provider of secure communication, boasting 60,000 users worldwide, including 10,000 in the UK. However, law enforcement agencies later discovered that the platform was widely used by organized crime groups for illicit activities.

Three Dutch men were arrested in June 2022 for allegedly making €56 million by selling EncroChat encrypted phones to criminals. The suspects—Frans S. (64), Peter H. (43), and Hassan K. (48)—were accused of money laundering and participating in a criminal organization.
Authorities believe they were key figures in distributing EncroChat devices in the Netherlands, which was the largest market for these phones. Their trial is still ongoing, with the next hearing scheduled in a few months.
Interestingly, their lawyers argue that they were simply selling phones and did not specifically target criminals as customers. They claim that journalists, athletes, and employees of a listed company in the United States also used EncroChat devices.

These three men have yet to go to trial, and there is not much about them to research; only that they sold the devices. What we should be considering is how three men manufactured and sold so many EncroChat phones worldwide.

Someone designed the phones, had the cameras and other traceable hardware and software removed, rebranded them as EncroChat encrypted phones, and sold them only on the black market to criminal gangs. Now look at Israel and the Hezbollah and Hamas pager sting: one of the most brazen attacks through electronic devices. We need to keep an eye on the trials of these three men; if we don't, they will just disappear under the radar. In reality, these three alleged sellers sit at the centre of the whole EncroChat enterprise and are responsible for many high-level criminals being arrested and charged with some seriously heavy crime. If I were charged over EncroChat encrypted connections, those three would be dragged in as witnesses to my trial. I refuse to believe these phones were manufactured for any reason other than to entrap those involved in organized crime. Just like the Israeli pagers: manufactured by security services to serve a purpose. They know most of us don't look into why criminals get arrested. But this intrigued me. Have a look at the Hezbollah pager, walkie-talkie, and electric car explosions.

How come no innocent members of the public were injured after purchasing one of these pagers?

How come there is no research on any ordinary private person buying an Encrochat phone for their own private privacy purposes?

PAGER, WALKIE‑TALKIE AND ELECTRIC CAR BLASTS AGAINST HEZBOLLAH AND HAMAS

As you can read, the pager attack was very sophisticated. So was EncroChat. An EncroChat handset could not be bought on the open market; they were touted by other criminals on encrypted applications like Telegram and WhatsApp — until, as the claim goes, a backdoor to WhatsApp was provided to law enforcement by Meta to track and trace criminal behavior on the platform.

Is it beyond belief that Europol, Eurojust, Interpol, the NCA and the rest had the EncroChat system made, then sat back and watched all the top criminals in Europe put themselves away? Those devices were never cracked. The encryption was untraceable. Placed a device on the EncroChat server? Why not just take the server, with all its details on it, and download it? They claimed to have physical access. Then the story became malware, sent or planted. Malware sent to the EncroChat server? Why not arrest whoever owned the encrypted server?

You have arrested the sellers, so why not the manufacturers?

Then there's the encrypted app Telegram. Why haven't similar techniques been used to break that encryption? Because they can't. They had to arrest the CEO instead, because they cannot crack Telegram's encryption, and law enforcement authorities in France, Holland and the UK all supposedly have access to that server. So send the same malware; attach the same device used on EncroChat.

Telegram has end-to-end encryption similar to the EncroChat devices.

Law enforcement agencies have struggled to crack Telegram’s encryption because:

  • Telegram uses end-to-end encryption for its Secret Chats, meaning only the sender and recipient can read the messages—not even Telegram itself. This makes it extremely difficult for law enforcement to intercept or decrypt those communications.

  • However, Telegram's regular chats are not end-to-end encrypted. Instead, they use server-client encryption, meaning Telegram stores messages on its servers. While this still provides security, it's not as private as Secret Chats.

  • That said, authorities can still track metadata, such as who is communicating with whom, and they can access Telegram messages if they gain access to a suspect’s device.



 

Law enforcement initially struggled to crack EncroChat encrypted phones because they were designed with high-level security features to prevent interception. These devices had:

  • End-to-end encryption that ensured messages were only readable by the sender and recipient.

  • Self-destruct mechanisms, allowing users to remotely wipe their phones or delete messages with a PIN.

  • Dual operating systems, with one appearing as a normal Android interface and the other hidden for encrypted communication.

  • Modified hardware, including the removal of cameras, microphones, and GPS to prevent tracking.

 

Hence why Telegram's founder and CEO, Pavel Durov, was arrested in August 2024 in France. The charges included complicity in crimes such as child exploitation and drug trafficking, which allegedly occurred on the platform. French authorities accused Durov of refusing to cooperate with law enforcement by not providing encryption keys or user data.
Durov was released after paying a €5 million bail, but he was placed under judicial supervision and banned from leaving France. Telegram has maintained that it complies with EU laws and argued that the platform itself cannot be held responsible for misuse by its users.
This case sparked debates about privacy, free speech, and the accountability of tech companies.

For your research:

The term "Venetic" has multiple meanings depending on the context:

  1. Historical Language: Venetic refers to an extinct Indo-European language spoken by the Veneti people in ancient northeastern Italy and parts of modern Slovenia. It was used between the 6th and 1st centuries BCE and is known from inscriptions written in the Venetic alphabet.

  2. Operation Venetic: In modern law enforcement, "Venetic" is associated with Operation Venetic, the UK's largest-ever crackdown on organized crime. This operation involved infiltrating encrypted communication platforms like EncroChat, leading to 746 arrests, the seizure of £54 million in criminal cash, and dismantling numerous criminal networks.

In our opinion, there is something extremely fishy going on with EncroChat and the surrounding investigation.

The Security Risks of Mobile Phones
Mobile phones are an essential part of our everyday lives. They are not just for making phone calls anymore. We use them to access the internet, send text messages, take pictures and videos, and manage many aspects of our lives. However, this convenience comes with significant security risks that you should be aware of.

Privacy and Security Weaknesses

Mobile phones were not designed with strong privacy and security in mind. This means they do not do a good job of protecting your communications. Hackers can intercept your calls, texts, and online activities because many apps do not use strong enough encryption. Even apps that seem secure can have weaknesses that hackers can exploit.

Location Tracking and Surveillance

Another big risk with mobile phones is location tracking. Every smartphone has GPS capabilities that can track your movements in real-time. While this is useful for navigation and certain apps, it also means that unauthorized parties can access your location data, which can be a serious privacy threat.

One of the most significant privacy threats from mobile phones is the way they constantly broadcast your location. This can happen in several ways, but here are some of the primary methods:

2. Mobile Signal Tracking 

 

Cell Site Simulator

A government or another technically sophisticated organization can also collect location data directly, such as with a cell site simulator (a portable fake cell phone tower that pretends to be a real one, in order to “catch” particular users' mobile phones and detect their physical presence and/or spy on their communications, also sometimes called an IMSI catcher or Stingray). IMSI refers to the International Mobile Subscriber Identity number that identifies a particular subscriber's SIM card, though an IMSI catcher may target a device using other properties of the device as well.

The IMSI catcher needs to be taken to a particular location in order to find or monitor devices at that location. Currently there is no reliable defense against all IMSI catchers. (Some apps claim to detect their presence, but this detection is imperfect.) On devices that permit it, it can be helpful to disable 2G support (so that the device can connect only to 3G and 4G networks) and to disable roaming if you don't expect to be traveling outside of your home carrier's service area. These measures can protect against certain kinds of IMSI catchers.

 

Wi-Fi and Bluetooth Tracking

 

Modern smartphones have other radio transmitters in addition to the mobile network interface. They usually also have Wi-Fi and Bluetooth support. These signals are transmitted with less power than a mobile signal and can normally be received only within a short range (such as within the same room or the same building), although sometimes using a sophisticated antenna allows these signals to be detected from unexpectedly long distances; in a 2007 demonstration, an expert in Venezuela received a Wi-Fi signal at a distance of 382 km (237 mi), under rural conditions with little radio interference.

Both of these kinds of wireless signals include a unique serial number for the device, called a MAC address, which can be seen by anybody who can receive the signal. The device manufacturer chooses this address at the time the device is created and it cannot be changed using the software that comes with current smartphones.

Unfortunately, the MAC address can be observed in wireless signals even if a device is not actively connected to a particular wireless network, or even if it is not actively transmitting data. Whenever Wi-Fi is turned on on a typical smartphone, the smartphone will transmit occasional signals that include the MAC address and thus let others nearby recognize that that particular device is present. This has been used for commercial tracking applications, for example to let shopkeepers determine statistics about how often particular customers visit and how long they spend in the shop. As of 2014, smartphone manufacturers have started to recognize that this kind of tracking is problematic, but it may not be fixed in every device for years, if ever.

In comparison to GSM monitoring, these forms of tracking are not necessarily as useful for government surveillance. This is because they work best at short distances and require prior knowledge or observation to determine what MAC address is built into a particular person's device. However, these forms of tracking can be a highly accurate way to tell when a person enters and leaves a building. Turning off Wi-Fi and Bluetooth on a smartphone can prevent this type of tracking, although this can be inconvenient for users who want to use these technologies frequently.

Wi-Fi network operators can also see the MAC address of every device that joins their network, which means that they can recognize particular devices over time, and tell whether you are the same person who joined the network in the past (even if you don't type your name or e-mail address anywhere or sign in to any services).

On a few devices, it is physically possible to change the MAC address so that other people can't recognize your Wi-Fi device as easily over time; on these devices, with the right software and configuration, it would be possible to choose a new and different MAC address every day, for example. On smartphones, this commonly requires special software such as a MAC address-changing app. Currently, this option is not available for the majority of smartphone models.
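To make the shopkeeper-tracking idea above concrete, here is a minimal Python sketch of what a passive observer can do with the MAC addresses in Wi-Fi probe requests. Everything in it (the function name, the MAC addresses, the timestamps) is invented for illustration; real capture tools work on live radio traffic, not neat tuples.

```python
from collections import defaultdict

def presence_log(probe_requests):
    """Group passively observed Wi-Fi probe requests by source MAC address.

    probe_requests: iterable of (timestamp_seconds, mac_address) tuples,
    roughly what a passive sniffer near a shop doorway might record.
    Returns {mac: (first_seen, last_seen, dwell_seconds)}.
    """
    seen = defaultdict(list)
    for ts, mac in probe_requests:
        seen[mac].append(ts)
    return {
        mac: (min(times), max(times), max(times) - min(times))
        for mac, times in seen.items()
    }

# A device broadcasting a stable MAC address is trivially re-identified:
captures = [
    (0, "aa:bb:cc:11:22:33"),    # customer arrives
    (60, "aa:bb:cc:11:22:33"),
    (900, "aa:bb:cc:11:22:33"),  # still in the shop 15 minutes later
    (30, "de:ad:be:ef:00:01"),   # a second, briefer visitor
]
log = presence_log(captures)
print(log["aa:bb:cc:11:22:33"])  # (0, 900, 900): a 15-minute dwell time
```

Nothing clever is needed: a stable identifier plus timestamps is already a visit log, which is exactly why per-network MAC randomization matters.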

 

Location Information Leaks From Apps and Web Browsing

Modern smartphones provide ways for the phone to determine its own location, often using GPS and sometimes using other services provided by location companies (which usually ask the company to guess the phone's location based on a list of cell phone towers and/or Wi-Fi networks that the phone can see from where it is). Apps can ask the phone for this location information and use it to provide services that are based on location, such as maps that show you your position on the map.

 

Some of these apps will then transmit your location over the network to a service provider, which, in turn, provides a way for other people to track you. (The app developers might not have been motivated by the desire to track users, but they might still end up with the ability to do that, and they might end up revealing location information about their users to governments or hackers.) Some smartphones will give you some kind of control over whether apps can find out your physical location; a good privacy practice is to try to restrict which apps can see this information, and at a minimum to make sure that your location is only shared with apps that you trust and that have a good reason to know where you are.

In each case, location tracking is not only about finding where someone is right now, like in an exciting movie chase scene where agents are pursuing someone through the streets. It can also be about answering questions about people's historical activities and also about their beliefs, participation in events, and personal relationships. For example, location tracking could be used to try to find out whether certain people are in a romantic relationship, to find out who attended a particular meeting or who was at a particular protest, or to try and identify a journalist's confidential source.

The Washington Post reported in December 2013 on NSA location-tracking tools that collect massive amounts of information “on the whereabouts of cellphones around the world,” mainly by tapping phone companies' infrastructure to observe which towers particular phones connect to when.

 

A tool called CO-TRAVELER uses this data to find relationships between different people's movements (to figure out which people's devices seem to be traveling together, as well as whether one person appears to be following another).

Turning Phones Off

There's a widespread concern that phones can be used to monitor people even when not actively being used to make a call. As a result, people having a sensitive conversation are sometimes told to turn their phones off entirely, or even to remove the batteries from their phones.

The recommendation to remove the battery seems to be focused mainly on the existence of malware that makes the phone appear to turn off upon request (finally showing only a blank screen), while really remaining powered on and able to monitor conversations or invisibly place or receive a call. Thus, users could be tricked into thinking they had successfully turned off their phones when they actually hadn't. Such malware does exist, at least for some devices, though we have little information about how well it works or how widely it has been used.

Turning phones off has its own potential disadvantage: if many people at one location all do it at the same time, it's a sign to the mobile carriers that they all thought something merited turning their phones off. (That “something” might be the start of a film in a movie theater, or the departure of a plane at an airport, but it might also be a sensitive meeting or conversation.) An alternative that might give less information away is to leave everybody's phone in another room where the phones' microphones wouldn't be able to overhear the conversations.
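As a toy illustration of the co-travel analysis described above (and emphatically not the actual CO-TRAVELER tool, whose internals are not public), one can score how often two devices were seen on the same cell tower in the same time slot. All names, tower IDs, and data below are invented.

```python
def cotravel_score(track_a, track_b):
    """Fraction of shared time slots in which two devices used the same tower.

    track_a, track_b: dicts mapping a coarse time slot (say, an hour number)
    to the tower ID a device connected to in that slot. A toy version of the
    idea behind co-travel analysis.
    """
    shared_slots = set(track_a) & set(track_b)
    if not shared_slots:
        return 0.0
    matches = sum(1 for slot in shared_slots if track_a[slot] == track_b[slot])
    return matches / len(shared_slots)

phone_1 = {0: "tower_A", 1: "tower_B", 2: "tower_C", 3: "tower_D"}
phone_2 = {0: "tower_A", 1: "tower_B", 2: "tower_C", 3: "tower_D"}  # lockstep
phone_3 = {0: "tower_X", 1: "tower_B", 2: "tower_Y", 3: "tower_Z"}  # elsewhere

print(cotravel_score(phone_1, phone_2))  # 1.0: likely travelling together
print(cotravel_score(phone_1, phone_3))  # 0.25
```

Run at carrier scale, even a crude score like this surfaces pairs of devices that move in lockstep, which is all an analyst needs to generate leads.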

 

Burner Phones

 

Phones that are used temporarily and then discarded are often referred to as burner phones or burners. People who are trying to avoid government surveillance sometimes try to change phones (and phone numbers) frequently to make it more difficult to recognize their communications. They will need to use prepaid phones (not associated with a personal credit card or bank account) and ensure that the phones and SIM cards were not registered with their identity; in some countries these steps are straightforward, while in others there may be legal or practical obstacles to obtaining anonymous mobile phone service.

There are a number of limitations to this technique. First, merely swapping SIM cards or moving a SIM card from one device to another offers minimal protection, because the mobile network observes both the SIM card and device together. In other words, the network operator knows the history of which SIM cards have been used in which devices, and can track either individually or both together. Second, governments have been developing mobile location analysis techniques where location tracking can be used to generate leads or hypotheses about whether multiple devices actually belong to the same person. There are many ways this can be done. For example, an analyst could check whether two devices tended to move together, or whether, even if they were in use at different times, they tended to be carried in the same physical locations.

A further problem for the successful anonymous use of telephone services is that people's calling patterns tend to be extremely distinctive. For example, you might habitually call your family members and your work colleagues. Even though each of these people receives calls from a wide range of people, you're likely the only person in the world who commonly calls both of them from the same number. So even if you suddenly changed your number, if you then resumed the same patterns in the calls you made or received, it would be straightforward to determine which new number was yours. Remember that this inference isn't made based only on the fact that you called one particular number, but rather on the uniqueness of the combination of all the numbers that you called. (Indeed, The Intercept reported that a secret U.S. government system called PROTON does exactly this, using phone records to recognize people who placed phone calls in a “similar manner to a specific target” from new phone numbers.) An additional example can be found in the Hemisphere FOIA document.

The document describes the Hemisphere database (a massive database of historical call records) and how the people who run it have a feature that can link burner phones by following the similarity of their call patterns. The document refers to burner phones as "dropped phones" because their user will "drop" one and start using another one, but the database analytics algorithms can draw the connection between one phone and another when this happens, so long as both were used to make or receive calls to similar sets of phone numbers.

Together, these facts mean that effective use of burner phones to hide from government surveillance requires, at a minimum: not reusing either SIM cards or devices; not carrying different devices together; not creating a physical association between the places where different devices are used; and not calling or being called by the same people when using different devices. (This isn't necessarily a complete list; for example, we haven't considered the risk of physical surveillance of the place where the phone was sold, or the places where it's used, or the possibility of software to recognize a particular person's voice as an automated method for determining who is speaking through a particular phone.)
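The call-pattern linking described above (the PROTON and Hemisphere examples) can be sketched with a simple set-overlap measure. This is a hypothetical toy, not the real analytics: contact names stand in for phone numbers, and Jaccard similarity stands in for whatever the real systems actually compute.

```python
def contact_overlap(old_contacts, new_contacts):
    """Jaccard similarity between the sets of numbers two phones called.

    The intuition: if a brand-new number suddenly starts calling the same
    distinctive circle of contacts as a recently abandoned number, the two
    phones probably belong to the same person.
    """
    old_set, new_set = set(old_contacts), set(new_contacts)
    if not old_set | new_set:
        return 0.0
    return len(old_set & new_set) / len(old_set | new_set)

dropped_phone = ["mum", "boss", "colleague", "gym"]
burner_a = ["mum", "boss", "colleague"]   # same circle, new number
burner_b = ["pizza_shop", "taxi_rank"]    # genuinely unrelated

print(contact_overlap(dropped_phone, burner_a))  # 0.75
print(contact_overlap(dropped_phone, burner_b))  # 0.0
```

The lesson matches the text above: swapping handsets changes nothing if the social graph around the number stays the same.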

 
The Scottish Child Abuse Inquiry's shortcomings are a danger to survivors

Unveiling the Scottish Child Abuse Inquiry: A Barbaric Treatment of Victims.

Underbelly True Crime

 

The Scottish Child Abuse Inquiry, established in 2015, was intended to provide justice to survivors of historical abuse in various Scottish institutions. However, its treatment of victims throughout the process has been nothing short of shocking, endangering the pursuit of truth and justice. This scathing assessment aims to expose the failures and shortcomings of the inquiry, highlighting the continued mistreatment of survivors and the unjust practices that have marred its reputation.

 

1. Insufficient Support for Victims:

The Scottish Child Abuse Inquiry has demonstrated a severe lack of empathy and consideration for the well-being of survivors. Numerous victims have spoken out about the insufficient support provided, forcing them to relive their traumatic experiences without adequate emotional assistance or counseling. This neglect not only disregards their mental health but also undermines their ability to effectively present their experiences, damaging the prospects of a fair investigation.

 

2. Delayed Justice and Evasive Tactics:

 

The inquiry's proceedings have been plagued by excessive delays, bureaucratic red tape, and evasive tactics, showcasing a disregard for survivors seeking justice. Time and time again, victims have been left waiting for their voice to be heard, only to be met with avoidable obstacles that hinder the progress of the inquiry. Such delays not only prolong the suffering of survivors but also allow perpetrators to escape accountability.

 

3. Limited Scope and Incomplete Investigation:

 

The Scottish Child Abuse Inquiry has faced criticism for its narrow focus, limiting the investigation to specific institutions and excluding others. By failing to broaden its scope and include all relevant organizations, the inquiry inadvertently perpetuates an incomplete understanding of the systemic failings that enabled child abuse to occur. This selective methodology risks shielding certain perpetrators, preventing a comprehensive and impartial examination of the issue.

 

4. Lack of Transparency and Public Confidence:

 

Transparency is crucial for an inquiry of this magnitude; sadly, the Scottish Child Abuse Inquiry has fallen short in this regard. The lack of openness and clear communication has eroded public confidence in the inquiry's ability to fulfill its objectives and deliver justice. The limited accessibility of information, repeated clashes with stakeholders, and inadequate explanation of decisions have all contributed to a profound sense of distrust among survivors and the wider public.

 

5. Damaged Reputations and Lost Opportunities:

 

The Scottish Child Abuse Inquiry's mishandling of victims' testimonies and its overall treatment of survivors have damaged its own reputation and created an environment of mistrust. Many victims have withdrawn from the inquiry due to the distress and re-traumatisation they experienced during proceedings, which can only be seen as a failure on the part of those responsible for ensuring a safe and supportive environment for survivors. The lost opportunities for genuine healing and justice for victims are inexcusable and further reflect the inquiry's ineptitude.

 

Conclusion:

 

In its handling of victims and survivors, the Scottish Child Abuse Inquiry has demonstrated a shocking disregard for their welfare and pursuit of justice. The lack of appropriate support, delayed justice, limited scope, and evasive tactics all contribute to a deeply flawed process. It is imperative that swift action is taken to rectify these shortcomings, providing survivors with the respect, dignity, and support they deserve in their quest for justice and healing. Only then can the inquiry truly serve its purpose and deliver on its promises.

We also discovered that, during the Scottish Child Abuse Inquiry, the Glasgow Social Work Department severely let down the survivors of historic abuse while they were in the care of the Scottish Government.

Their record keeping was appalling: records that proved the abuse were either deleted or destroyed, and the same goes for their care home records. The names of staff at the Larchgrove remand home have been deleted. If it wasn't for other survivors remembering the same staff names, there would be no case to answer. This alone prevented me from pursuing a civil case, and in my opinion the redress scheme is not fit for purpose. It has been nothing less than disgusting.

 

This Inquiry has brought to light a dark chapter in Scotland's history, exposing the systematic abuse suffered by vulnerable children in various care institutions in Scotland

While children that were abused were reporting this abuse to their social workers, no investigation or protection was offered to the children. Instead, the reports were destroyed, deleted or hidden.

Underbelly True Crime discovered this was done to protect the abusers and said institutions from prosecution, and to cut the workload in the Social Work Department, according to one former Glasgow social worker.

 

The first major failing of the Glasgow Social Work department revealed during the Scottish Child Abuse Inquiry was its inadequate investigation into allegations of abuse. Overwhelming evidence has shown that numerous instances of abuse were reported to social workers by children, but action was either delayed or failed to materialize entirely. The department's inability to respond promptly and effectively exacerbated the suffering of vulnerable children, allowing abusers to continue their heinous actions with impunity.

Further exacerbating the problem, even when social workers acknowledged the abuse, the Glasgow Social Work department often prioritized preserving the reputation of the perpetrators or institutions over the safety and well-being of the children. This systemic failure demonstrates a flawed approach that favored protecting abusers and their organizations rather than ensuring justice and safeguarding the victims.

Another concerning aspect of Glasgow Social Work department's response to the inquiry has been its failure to adequately support survivors of abuse.

The testimonies provided by survivors during the inquiry have shed light on the lack of appropriate support and care they received following their traumatic experiences.

Survivors have highlighted the insensitivity and lack of empathy displayed by social workers. Many reported feeling dismissed, ignored, or re-traumatised by the department's handling of their cases. This failure to provide necessary support to survivors, particularly when they were at their most vulnerable, not only perpetuated their pain and suffering but also undermined their trust in the system designed to protect them.

The lack of accountability and transparency within the Glasgow Social Work Department (Pollok) is yet another area where significant failings have been uncovered. The inquiry has revealed alarming instances where crucial evidence was withheld or destroyed, hindering the pursuit of justice for survivors and the identification of those responsible for the abuse.

Additionally, the department's resistance to external scrutiny and a culture of secrecy has eroded public trust and confidence in its ability to tackle child abuse cases effectively. Without meaningful accountability and transparency, it becomes difficult to identify the root causes of the department's shortcomings and implement the necessary reforms to prevent such failures from reoccurring.

Addressing the failings of the Glasgow Social Work department and ensuring justice for survivors is the only way forward and it requires comprehensive reform from Glasgow Social Work Department to safeguard vulnerable children effectively. It is crucial to establish robust processes for the investigation and protection of children, ensuring social workers are equipped with the necessary resources and training to respond promptly and effectively to abuse allegations.

Moreover, survivors deserve a compassionate and supportive response from social workers, with trauma-informed practices at the forefront. Adequate funding and resources must be allocated to provide ongoing support services for survivors, fostering healing and recovery.

Furthermore, establishing clear lines of accountability and promoting transparency within the department will help restore public trust in its ability to protect children from abuse. This necessitates independent oversight of inquiries and investigations, ensuring all actions taken are impartial and aim to deliver justice.

The failings of the Glasgow Social Work Department during the Scottish Child Abuse Inquiry are deeply troubling and indicative of a significant disservice to vulnerable children and survivors of abuse. A new department with robust reforms, greater accountability, transparency, and improved support services for survivors is urgently needed to rectify this department's repeated failings and prevent such atrocities from occurring in the future. Only through these measures can we hope to provide justice and healing for survivors and eradicate the systemic failures that allowed the re-abusing of the abused to persist for far too long, with some survivors suffering from this re-abuse for longer than the original abuse they endured while in the care of the Scottish Government.

How can this be right?

Then we have our current SNP government, who, may I say, "screamed" for this Inquiry to take place. They buy motorhomes and Christmas gifts, they shop on Amazon; they just spend our country's money on whatever they want. Meanwhile, us survivors sit back and watch them enjoy themselves as much as the care home staff who carried out the abuse did so many years ago.

A free-for-all: back then on children, these days on other people's money.

That's our Government for you.

What about the survivors?

The PR spin has now gone. Now the Inquiry is a hindrance to them; or rather, us survivors are a hindrance to them. A piece of shit on their shoes, as we have always been. Sad, but true.

Larchgrove Remand Home


(C) Underbelly Magazine 2018
