Blog

Should kids have smartphones?

by George Lovell

Most kids get their first smartphone between the ages of 10 and 12.

We sell a lot of phones to parents in July and August, just before their kids start secondary school (age 11), but it has become much more common to see kids as young as 10 with iPhones. Up until that point, they're quite happy just having an iPad to play games and watch YouTube on, but when other kids in their class start Snapchatting, it's only natural that they want to join in.



Is this a bad thing?

Smartphones provide quick, easy, "cheap" dopamine, i.e. reward without effort (never good), and this makes them addictive but ultimately unfulfilling. These effects can be mitigated by limiting total exposure and ensuring that they do not displace the activities and behaviours that we know are critical for our health and well-being: exercise, time in nature, and social connection.

A smartphone is about as close to a necessary possession as there is. To not have a smartphone in 2023 is to risk being disconnected from others and failing to develop the technological prowess and relevant soft skills required to function in modern society. At the very least, it'd be damn inconvenient, and this is especially true for younger generations.

In my opinion, the benefits of technology significantly outweigh the costs at an aggregate level, and I believe this to be true in the case of smartphones - for most people in most cases. Mindfully and appropriately used, our smartphones enable us to be more productive, better connected, and less stressed. What constitutes mindful and appropriate usage is debatable and will vary from person to person.

It's far more difficult to make this case for social media - in young people at least.

The average age at which kids sign up for their first social media account is 12.6 years.

There's sufficient evidence to suggest that social media platforms are the leading contributor to the current mental health crisis among teens. Reports of anxiety, depression and suicide attempts amongst young people began skyrocketing in the early 2010s, just as social media was becoming increasingly popular and accessible.



The data continues to show that social media usage correlates with poorer mental health outcomes, and that the magnitude of this effect is notably greater in teens.



Moderate use appears to have little effect, whereas heavy use has a disproportionately greater effect. Mood disorders are more prevalent in girls than boys. For boys, the depression rate doubles as daily use increases from 2 to 5 hours. For girls, it triples.



Would you let your child vape?

Perhaps it was always to be expected that your teenager would sneak a cigarette or a beer when they got the chance, and this was seldom cause for concern. By contrast, the notion of carrying and hitting a vape all day, every day should be far more alarming to a parent. But what if we are glossing over a device that is far more addictive and destructive than a vape?

Our prefrontal cortex - the part of the brain associated with complex decision-making and impulse control - isn't fully developed until around age 25. This is why we restrict freedom of choice, by law and as parents, until the age of 18.

It's so obvious that children shouldn't have unlimited access to drugs, pornography and gambling.

Can we really expect a child to exercise appropriate caution with respect to social media use?

In my experience and observation, practising healthy habits and moderation is difficult enough as an adult, so it'd be unfair to expect a child to make informed and responsible decisions for themselves - especially when you pile peer pressure on top: no teen wants to be the one weird kid without it, and no adult wants to be the one out-of-touch, overly strict hippy parent who doesn't allow it.

I believe that an adult should be able to do whatever they want - within reason and provided it doesn't cause harm to others. Where we draw the line, and for which choices, is open to debate, but based on the mountains of data and anecdotes, I think there should be strict age and/or usage limits on social media use specifically. Not doing so is setting up a non-trivial proportion of young people for failure.

I think that smartphones (and computers) are essential items for just about everyone, including teens. Whilst they have their dangers and downsides, these can be mitigated with appropriate supervision and education - a role that falls to parents and teachers.

You remember how "growing up" was at times really difficult and disconcerting. Technology has only made it more challenging, which is why we should sympathise with the youth of today, even though we might not understand them. Younger generations are the first to grow up with an iPad. We don't know exactly what effect this has, but it's clearly a significant one - one that has the potential to damage malleable minds, which could lead to greater societal problems.

It's important to note that these issues relate to internet access and that the smartphone is just a vessel for this.

Smartphones and social media are not synonymous. You cannot blame the technology for ruining a generation; it's how we use it that matters. For the most part, we are all subject to the same benefits, challenges and risks, regardless of age. Social media is an outlier because it has a disproportionately harsh and wide-reaching effect on young people.

Thanks for reading!

See Our Blog for the latest industry news, tech tips, company updates, and anything else we feel like writing about. 


Back to chemistry class

by George Lovell

Your iPhone, laptop, Xbox & Fitbit would not exist without these elements. 

Check out the full list here



Platinum, silver, nickel, fibreglass and many other magical materials are also required to create the modern miracles that we call circuit boards. It's quite amazing that we've figured out how to precisely combine tiny quantities of so many materials to create insanely powerful computers. From shiny bits of ore out of a hole in the jungle, to the iPhone in your pocket. If that isn't alchemy...

We're often asked if old phones have value in precious metals. The answer is yes: if you could separate, melt down and cleanly extract all the materials from your phone, you'd have about a dollar's worth!
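
For the curious, here's a back-of-the-envelope version of that sum. The per-phone quantities and prices below are rough assumptions of ours, pulled from commonly cited estimates, not measurements of any particular model:

```python
# Rough, illustrative estimate of the raw metal value inside one phone.
# Every figure here is a ballpark assumption, not a measurement: actual
# contents vary by model, and metal prices fluctuate daily.

grams_per_phone = {
    "gold":      0.025,   # ~25 mg, mostly in connectors and circuit boards
    "silver":    0.25,    # ~250 mg
    "palladium": 0.010,   # ~10 mg
    "copper":    15.0,    # ~15 g of wiring and shielding
}

price_per_gram_usd = {
    "gold":      60.0,
    "silver":    0.80,
    "palladium": 35.0,
    "copper":    0.009,
}

total = sum(grams * price_per_gram_usd[metal]
            for metal, grams in grams_per_phone.items())

print(f"Raw metal value per phone: ~${total:.2f}")
```

Depending on whose figures you plug in, you'll land somewhere between a dollar and a few dollars - and that's before the (considerable) cost of actually extracting any of it.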

Check out our sources for further reading:



For a deep dive into the mining, manufacturing and economics behind what goes into your device, check out this article from Vice.

If you're into the chemistry, the National Museum of Scotland will sort you out.

Thanks for reading!


Does the world need a faster MacBook?

by George Lovell

Apple recently announced the M3 chip series. A top-spec MacBook Pro with the M3 Max boasts 92 billion transistors, a 16-core CPU, a 40-core GPU, and 128GB of unified memory (RAM built into the same package as the chip).

Impressive. Powerful.

Granted, the average person doesn't know, let alone care, how many transistors, cores or gigabytes their device has. They care about how it performs overall - that it runs all their programs smoothly.


Very few people will dig into the specs and benchmark tests. Even fewer will truly understand them beyond "this year's numbers are greater than last year's".

"Scary Fast", as it was named in their Halloween launch event, is a lot more appealing and relatable to the average Joe.

All you need to know is that one of these devices should be able to handle just about any task that you throw at it.

A Mac has never really been a viable option for gamers shopping for a new computer. Apple has recently (finally) become interested in gaming - as they should, given that the video game sector is larger than the movie and music industries combined, and is projected to be worth $321 billion by 2026.

It's hard to envision an M3 Mac experiencing any difficulty running a complex, demanding game. It's equally hard to envision the PC gaming community fully embracing Apple. In either case, we'll have to wait and see.

It's becoming less and less important to take specs into account when purchasing a new device. The vast majority of users needn't split hairs between different types of processors, assuming they are opting for a high-end £600+ device over a mid-range alternative - just as the vast majority of drivers aren't concerned with how a vehicle's gear ratios or aerodynamics will influence its top speed. Any half-decent phone or computer will run Netflix and Snapchat without getting out of first gear.

Why would you ever need anything which performs above and beyond your requirements? The most efficient product in any given situation is the one that precisely meets, but does not exceed, your current requirements. You needn't second-guess what you currently have unless it's hindering your productivity or enjoyment.

So is there any point? Well, having a fast car is cool, even if you never take it over 90mph. It's better to have more power than you need; it's there in case you ever do need it. It represents progress, which is what we all strive for - living a fulfilling life as individuals and making the world a better place collectively. Being part of this feels good.

The M3 isn't much of a step up from the M2, because you simply cannot make that much progress from year to year. It'd be quite difficult to justify upgrading from an M1 to an M3. If you're currently using an old Intel Mac, however, this would be a significant enough upgrade, and you wouldn't have to worry about upgrading again for several years. Law of contrast: a big upgrade is always more exciting and satisfying than a small, incremental one.

Some people will mock or criticise a company for headlining its new product's slightly darker shade of paint, proclaiming it to be an insufficient upgrade or a sneaky trick to increase sales at higher prices. In actuality, there usually are some real and meaningful improvements to the product itself. The millions of dollars spent on a huge team of marketing experts determined that consumers would respond more favourably to a darker shade of paint than to actual performance upgrades. People really care that it looks good, and paint isn't subject to scientific or economic constraints, so why not pull on that disproportionately long lever as much as possible?


Every ad that you see has been ruthlessly split-tested and refined to induce an optimum combination of desire, craving, jealousy, insecurity and anxiety in the largest proportion of consumers possible. It's just effective marketing.

Technology can be pretty difficult to explain or comprehend. "Our best camera yet" and "our fastest chip ever" are pretty clear-cut. The people who want to geek out (or argue) over computer specs will have no trouble finding them in their online geek communities.

Sharron, on the other hand, just needs to think that the new MacBook will make her more productive and popular. If you can afford a £1700 MacBook Pro M3 to watch YouTube and play solitaire on, then have at it. I think most of us would rather cruise the middle lane at 60 in a Lambo than in a Fiesta. The difference here is that an M3 Mac will be common and affordable in a decade. That's the beauty of tech.

Literally any device or piece of technology that you use today was, at some point and perhaps not too long ago, highly sought after and indeed considered cutting-edge.

These tools enable more artists and professionals to create more and better music, movies, art, and other content that you enjoy. Technology is the driving force behind advances in engineering, manufacturing, healthcare, and just about every major industry. And it only truly thrives in a competitive market. Even if you stay far from the cutting edge, and whether it excites you or not, its benefits will trickle down to you in time.

You might not need a faster Mac, but the fact that it exists will certainly benefit you.

Thanks for reading!


Stop worrying about AI

by George Lovell

People are scared about the future of AI and super-advanced technology.

A recent poll reported that 42% of respondents fear that AI will replace their jobs, 47% feel extremely nervous that AI carries very significant risks and could progressively destroy the human world, and 24% expressed outright anger towards AI and its applications.

That's a lot of fear and negativity. We don't like that, so we've put together a case for why you probably shouldn't worry too much about future technologies.

"The robots are stealing our jobs!" - People, since 1920. Many large companies have announced plans to discard thousands of employees in coming years. People are losing their jobs to tech, but that same tech is simultaneously creating new job roles at an equal or greater rate. This is nothing new: 60% of current jobs did not exist in 1940. Historically, technology has always been additive. When the Polaroid camera was invented, artists panicked that no one would buy their paintings anymore, but have you seen the price of a da Vinci portrait? Humans provide value in a way that technology cannot. We have and always will find ways to co-exist with technology.



AI has been used to create deepfakes, internet trolls and scams. AI has also been used to track down and convict paedophiles and to diagnose early-stage cancers. On balance, it's surely been a net positive for humanity so far. Could the balance shift? Yes. But why assume that it will? We are (mostly) aware of the negative consequences of the technology we use today, but would you ditch your iPhone entirely to negate them? Didn't think so.

There's a good chance that the world will be a very different place by the end of the decade, or the next decade, or certainly at some point before 2050. This can be a startling realisation, but take solace in the fact that the present day is, and always has been, the best day to be alive. Lifespan, poverty, crime, freedom and peace have all been trending in the right direction since records began. Yes, we have blips - but by all accounts, and measured over a sufficient period of time, things keep getting better. Humans - our team - have applied their intelligence across various domains: science, engineering, art, philosophy and so on, and this has transformed us from sick and scrawny tribes of subsistence farmers living in mud huts to the rich and glorious cities we live in today, all in a matter of 4000 years. We're not going to suddenly start regressing. Technology is a multiplier on that same intelligence which took us from mud huts to skyscrapers; if anything, it should facilitate a much faster rate of progress.



There have always been doomsayers. How many times have they been proven right? Famine, drought, poverty and conflict are always just around the corner, but rarely do they materialise in such dramatic and disastrous fashion. Sensational claims quickly dissipate and are forgotten - that is, until the next trendy prophecy emerges. Our negativity bias makes stories of doom and gloom more attractive to our primitive little brains. The media, through both human journalists and computer algorithms, capitalises on our insatiable appetite for catastrophe and conspiracy by driving such stories to the top of our feeds. A broken clock might be right twice a day, but 99 times out of 100, the simple, rational, boring narrative triumphs.

The future of tech is anyone's guess.

Experts always disagree, and they are just as subject to bias as the rest of us. Remember that most experts have dedicated tens of thousands of hours to studying their specific field, yet they often arrive at vastly different conclusions. Take minimum wage for example: as you sift through news sources and blog posts you'll find overwhelming evidence in the form of scientific studies and expert consensus that proves beyond reasonable doubt that raising the minimum wage will result in job loss and economic damage. But dig a little deeper, and you will also find overwhelming evidence in the form of scientific studies and expert consensus that proves beyond reasonable doubt that raising the minimum wage will result in decreased unemployment and economic growth. This is outlined brilliantly in section II of my favourite blog post on the internet.



So whose opinion should you trust? Nobody's. Even the most evidence-based theories, as presented by the world's leading experts, can be refuted with equally "strong" evidence by opposing experts. And this is before the raw data has been cherry-picked, misconstrued, manipulated to fit a pre-existing belief or agenda, and neatly packaged in a dumbed-down yet compelling format that makes you feel like an expert after less than 2 minutes of work. If economists can get it so wrong in a field that's existed for 250 years, then you probably shouldn't put all of your trust in any single expert working in an infinitely complex field that's almost unrecognisable from one year to the next. Anyone who stands by their conviction with absolute certainty should be avoided.

Take any 19th or 20th-century genius - they were wrong about most things we know to be true today. Consider this: it is impossible to create a list of things that haven't occurred to you, yet that list would be 100x longer than any other list you could make right now. That's true for every single person, all the time, including experts. We don't know what we don't know, and we never will.

Making predictions is more difficult than we like to imagine. How often is the weather forecast bang-on? How much money have you made from betting on football games? Sometimes we get lucky, but time and time again, we fail to make accurate short-term predictions despite having an abundance of data. To expect any one person to accurately predict the trajectory of a super-advanced technology and its social and economic implications over a 10-year span is quite unreasonable. You can go back and find some pretty laughable predictions made by world-leading experts such as Steve Jobs, Bill Gates and Elon Musk. This prediction gap will only widen as technology becomes more advanced, because it becomes increasingly difficult (impossible, even) for any one person to understand it. So you may as well resign yourself to the fact that even the experts - who know far more than you - will be mostly wrong in their predictions.

You don't know how it works. At best, you have an extremely vague understanding of advanced technologies and their implications, which you've cobbled together based on your favourite writing from those with a slightly less vague understanding. Dedicate every waking moment of your life to understanding something like AI and you'll still be way behind the curve. If it interests you, that's great, but don't treat any one opinion as gospel. Remain open to opposing viewpoints and possibilities.



Stoicism teaches us that we have power over our own minds - our opinions, judgements and attitudes - but not over outside events. Understanding and accepting exactly what is under our power is the key to leading a happy life. As Epictetus said, “The more you seek to control external events, the less control you will have over your own life.” A very tiny subset of the population has any meaningful control over how AI and other technologies will impact humanity. You have none. Any time and energy spent worrying about technology is sure to be wasteful and unproductive.

Fire is a really useful tool. It can also burn a city to the ground. That's why we have a fire service, firemen, fire extinguishers, fire doors, fire alarms and fire drills. We invented these things so that we could safely and effectively utilise fire without getting burned. Assuming we put the correct fire-door-equivalents in place for technology, we should be able to mitigate its potential dangers.

We don't see fire itself as good or evil; whether it is utilised for good or evil can be entirely ascribed to human intent - and unfortunately, there will always be evil humans. Like fire, tech is neutral. How humans utilise it will ultimately determine our fate. So perhaps it's not the tech that we should be afraid of, but the humans wielding it.



"The computer is a bicycle for our minds" is one of my favourite Steve Jobs quotes. Perhaps AI and other cutting-edge technologies are more akin to a rocket ship for our minds. We're travelling at greater speeds than ever before, which is kind of scary. Human ingenuity has got us this far without blowing up, so whilst we shouldn't be complacent, there's little evidence that we won't figure out what we need to as we go. We make mistakes, and we often look back, both at ourselves and society, and wonder how the hell we got it so wrong. Like watching a toddler learning to paint: it's a messy, sometimes painful, but altogether essential learning process. Generally, we humanoids figure it out, course correct, and come out ahead.

It'd be foolish to completely disregard the potential negative consequences of AI and other technologies, and we should all prepare accordingly with the information and intuitions that we do have. There's no doubt that it will bring challenges and opportunities, and that it will impact everyone differently. We must also recognise that on an individual level we have effectively zero control or predictive power over how it will pan out, so to worry about it is futile and only creates unnecessary stress. Instead, take a moment to appreciate that it's a wonderful and exciting time to be alive. You might just feel a little calmer as the robots incinerate your home.


Thanks for reading!


Cold out there mind, innit?

by George Lovell

Small talkin' 'bout the weather is as British as tea and crumpets.

Not everyone watched the rugby. Not everyone got stuck in traffic this morning. But rest assured that we are all experiencing the exact same weather conditions simultaneously. It's quick, easy and relatable. It's not controversial or debatable - provided it doesn't spark a discussion on climate change.

One survey found that Brits speak about the weather three times per day for a total of 9-10 minutes, which adds up to four and a half months across a lifetime.
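
The maths roughly checks out, too. Here's a quick sanity check - the lifespan figure below is our assumption, not the survey's:

```python
# Quick sanity check on the survey's lifetime figure.
# Assumptions are ours, not the survey's: 9.5 minutes of weather talk
# per day, sustained over ~56 adult years (roughly ages 18 to 74).

minutes_per_day = 9.5
adult_years = 56

total_minutes = minutes_per_day * 365 * adult_years
total_days = total_minutes / (60 * 24)   # minutes -> days
total_months = total_days / 30.4         # average days per month

print(f"~{total_days:.0f} days, or about {total_months:.1f} months")
# -> ~135 days, i.e. roughly four and a half months
```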

That's a lot of weather talk. Sure beats Covid talk though.



Why bother?

Polite conversation on trivial matters simply allows the speaker to convey friendly intentions and a desire to engage in a positive interaction. We can signal and gauge each other's moods without stating them explicitly. Such conversational foreplay lays the foundation for deeper, more important topics of discussion. Sometimes it just fills an awkward silence - a silence that can feel like social rejection and induce a state of mild panic.

You might get a bit fed up with having the same damn chat over and over today, so just remember that it serves a purpose and that we're all freezing our asses off in the same boat because we can't afford to turn the heating on - there's a hot-n-spicy conversational stepping stone for ya.

Thanks for reading!
