Learn Colors with Oddbods
Oddbods Funny Cartoon
-------------------------------------------
A Foreigner Reacts to SIDxRAM | LIVE @ STUDIO 21 - Duration: 12:57.
-------------------------------------------
Railway Department Offers 10 Lakh Insurance | IRCTC Latest Updates 2018 | IRCTC Insurance Scheme - Duration: 2:50.
-------------------------------------------
What Leads Someone to Dishonesty? - Duration: 5:25.
-------------------------------------------
I Love It EXCEPT it's just Adele Givens being Adele Givens - Duration: 0:33.
The trilogy the world never wanted has come to a thrilling end.
-------------------------------------------
GOP prepares to destroy victim of assault to put Kavanaugh on SCOTUS - Duration: 3:33.
-------------------------------------------
Brainwashing & Mind Control - Duration: 28:23.
So far, no technology exists that can control what we believe.
Or so we believe...
So our topic today is brainwashing and mind control, and we're going to look at some
of the technologies that might be used for it or to defend against it, but we'll also
be looking at some of the implications this might have on civilizations far away in space
and time.
Mind Control scares us for the obvious reason that it's an invasion of the most extreme
and revolting nature; but it's also so frightening because, politically and scientifically
speaking, it's all too plausible.
We don't really go through life fearing that murder or theft will be legalized in
the future, but fear of a society in which everyone has been brainwashed is one of the
most common themes of science fiction.
Why?
In part because it's something we already have to deal with constantly.
Simply by existing in society we are constantly subject to attempts to manipulate, sway, or
indoctrinate us.
Some level of it is necessary because children need to be educated in proper behavior,
not just academic knowledge.
But we know all too well that privacy and freedom can be eroded away for even apparently
benevolent reasons, and we know there's no shortage of non-benevolent folks too.
So we could end up with a dystopian nightmare, again something common in science fiction
and nowhere better done than in George Orwell's novel "1984", a terrifying story set in
a dystopia of constant surveillance and indoctrination.
Not many scifi novels have such a huge impact that they make a permanent impression on society,
but even 70 years after its publication, a reference to 1984 or the term "Orwellian"
or "Big Brother" brings an image of utter totalitarian control to mind, regardless of
whether or not one has read the book.
If you haven't, I'd certainly recommend it.
You can pick up a free copy of 1984 today, and also get a 30-day trial of Audible, just
use my link, Audible.com/Isaac or text Isaac to 500-500.
Let's take a moment to better define what we mean by Mind Control and what types of
techniques need to be considered.
There are of course levels to the intrusiveness of mind control.
At the lowest level, there's simple influence, like when parents, educators, journalists,
and others simply show and tell you things aimed at getting you to view things a certain
way; if those people are your only sources of information, their influence will shape
your thinking for a long time, even after you're away from them.
When that influence becomes more direct and is aimed at shaping your political behavior,
it morphs into propaganda.
Subliminal methods form quite a broad category, one that includes some well-proven techniques
used in movies and advertising and by persuasive speakers; what they all have in common is
that they exploit the fact that we can only consciously process part of the information
we take in, and much of it gets processed only unconsciously.
We're getting into sinister territory with conditioning and aversion therapies, which
are called brainwashing when they're applied forcibly, although stories like A Clockwork
Orange explore its benign, socially beneficial uses.
Then there's neuro-hacking, where we directly alter your thinking, using either neurochemicals
or nano-bots, by reconfiguring the neurons you think with.
At the highest level of mind control, a new species could be engineered, or an existing
one re-engineered, to simply possess or lack the cognitive traits of interest or of concern.
There's no need to police or even forbid activities that no one is inclined to do.
Obviously the categories in this loose hierarchy overlap quite a bit, and even there being
six of them instead of eight or twenty is a bit arbitrary.
Most of the methods we'll discuss today arguably match more than one of these descriptions.
In a society where mind control is ubiquitous and successful, you wouldn't really need
a draconian police state or constant surveillance.
There's no need to hunt for rebels if no one rebels.
And indeed, citizens will surveil one another; if their neighbor expresses an anti-societal
thought, they will render assistance to him by contacting the authorities, the same way
you or I would call an ambulance if our neighbor fell off a ladder.
It's not betrayal, and they're not choosing loyalty to the state over their friendship
with him, they're doing him a favor, and he'll thank them sincerely when he gets
home from his brain scrubbing session.
After all, who doesn't want a nice squeaky clean brain?
Not a very dystopian civilization on the surface.
In fact, the really disturbing thing is that it might appear incredibly Utopian.
It's likely everyone would be brainwashed, even if only for things as simple as conditioning
to keep them from injuring anyone except in desperate self-defense and to be courteous
and not to steal—behaviors we already do our best to indoctrinate people into.
If everyone has that, even the Supreme Dictator, it's hard to call that an evil empire.
Of course the idea is usually that the folks in charge are exempt from the conditioning
and use it to enslave everyone else.
Even for the other case though, where it is everyone without exception, the notion makes
me rather queasy and I doubt I'm in a minority there.
We have a term here for such civilizations, which is Post-Discontent Civilization, in
contrast to a post-scarcity civilization.
This has a fairly hazy borderline, much like brainwashing and indoctrination versus conditioning
children to act civilized, but the simple example would be as follows: In a post-scarcity
civilization people can get almost anything they want without much trouble and have a
lot of luxuries.
In a post-discontent society, everybody has been made very content with what they have,
which may be virtually nothing.
You probably indoctrinate kids in a post-scarcity civilization to avoid excess too, like
wanting their very own planet; they might still want one but feel embarrassed to pursue
that request or tell people about it. In a post-discontent civilization, by contrast,
they might work 16-hour shifts every day while coming home to a filthy, rundown hovel,
and be entirely blissful about that.
This is the concept that truly terrifies us; it goes beyond even the feeling that it's
better to die on your feet than live on your knees. It's the notion that you could be
turned into a drone who is entirely happy with that existence.
That you could be totally oppressed and overjoyed about it.
This is doubly problematic, because we're aware of people for whom this is already true,
particularly for mild forms of it, and because it comes up often with artificial intelligence
too.
One of the most common proposals for dealing with intelligent machines is to make them
so that they love their work, and that's one of those thin ice areas.
On the one hand, it's certainly kinder to make an intelligent vacuum that loves cleaning
floors than one that hates it, but on the other, if we were raising kids to enjoy being
floor cleaners I think most of us would be pretty aghast at that.
The analogy might be a bit iffy though. First, there's no reason to make a sentient vacuum
cleaner, and we do not react the same way to being told a kid was raised to love being
a doctor or astronomer.
Second, there's also that line between compulsory and encouragement, and the motivation for
it.
A lot of parents have some dreams in mind for their kids, but they're rarely compulsory
and typically done for that kid's benefit, or perceived benefit anyway.
Our objections tend to come when it feels like it went beyond encouragement or wasn't
really about the child's best interest.
A lot of us end up following those parental dreams and loving it, I can't even write
my own name down without being reminded I do, but that doesn't mean we were compelled
to do it or that the motivations for that encouragement were bad.
And in a wider context, picking your kid's profession has been more the rule than the
exception historically.
Of course there's a reason why we disapprove of that nowadays.
It is one of the reasons I tend to dislike the notion of creating artificial intelligence
with preset motivations we picked with our own best interest in mind, not its own.
I think there is a genuine difference between creating an AI to be a happy vacuum cleaner
and building a bunch of them meant to pilot probes off to space, who are encouraged to
want to do that but still given a choice.
And a real choice too, not "You don't have to pilot the probe, but if you say no
you'll be scrapped or used to run a sewage treatment plant."
Fundamentally you just avoid building something with any more intelligence than it needs,
and thus avoid much of the problem, as being built for a task that requires intelligence
and judgment is likely less demoralizing than being built for the purpose of passing butter
and for some reason being given sentience for that task.
I think you need Informed Consent and enough leeway in the encouragement that alternatives
are both available and attractive.
Using that probe example, let's say we raised a bunch of artificial intelligences to run
probes to other worlds. One reason to do that might be because you want to be sure that
if they get out in deep space, far from supervision, they don't go off the rails and decide to
exterminate some alien planet or park in a solar system and start manufacturing warships
to come back and conquer Earth.
You have guidelines for what you want them to do and not do while they're out far from
Earth, but you want them smart so they can make good decisions and have some flexibility
to pick those and carry them out.
In such a case, being absolutely certain they will not break one of those key rules is not
only preferable, but arguably the best moral action.
Let's humanize it though: say we were launching a crewed ship instead.
Not folks we raised from birth or anything, regular astronauts who entered a program voluntarily
and enthusiastically.
But we tell them the final step is they have to submit to indoctrination to follow certain
guidelines.
Very extreme indoctrination, essentially unbreakable.
Not weird or secret guidelines either, ones they've been told about during the entire
program.
They don't have to submit to the indoctrination, but they don't get on that ship if they
don't.
No other coercion, they might get picked for another program, they can leave for a new
career, they won't be blackballed or mocked for refusing, but no indoctrination, no voyage.
We've decided we simply can't risk sending out explorers who might, however unlikely,
decide that the planet they found out there with life on it should be conquered, sterilized,
or even visited, and we need to be sure of them because when they're light years away
we have no way of enforcing that policy.
Tricky ethical case, because it was all voluntary; they knew what the rules and guidelines for
the mission were from Day 1 and agreed to follow them, and a "Don't you trust me?"
defense isn't exactly reasonable, because it's not quite them arriving at that alien planet,
it's them after spending a big chunk of their life in stressful traveling conditions and
isolation before arriving there.
If I send a bunch of colonists off on a 40-year journey to colonize Alpha Centauri, but
with the caveat that if they find so much as a microbe on that planet they are to scrap
that plan, I'm going to have my doubts about whether they'll stick to that.
This is the problem, we know the mind is programmable, at least to some extent, and we know we'll
get better at it, and we know there are some very good reasons to employ it.
Ethically, I would much rather tell a kleptomaniac that they could just be brainwashed into not
wanting to steal anymore and go home tomorrow rather than stick them in a cage for a year,
an expensive cage too, and so long as they've been given a choice, and both choices are
reasonable, I don't see the problem.
A coercive ultimatum, like telling them it would be life in prison or the brainwashing,
is different, so is an unreasonable option, like being brainwashed above and beyond the
negative behavior, so that they couldn't lie or do anything selfish anymore.
He stole, and so his option is to take the usual and reasonable punishment, or have that
specific bit of him adjusted so he won't repeat the behavior.
It's difficult to argue his treatment is unethical in such a case.
Tied into that, we already do a lot of voluntary behavior modification, and while one can argue
about how effective hypnosis is, people who pay for it generally assume it is effective,
which is what matters for the ethics of it.
Similarly, whether or not a medication designed to break an addictive habit is 100% effective
or just helps most folks is not our major concern.
There's not much difference between taking a pill that makes you inclined to quit smoking
and reduces the urges versus one that absolutely and instantly removes the desire entirely,
except that the latter will sell a lot better.
The ethical issue there is whether taking it was voluntary, and whether it was full and
informed consent: they knew what it did and all of what it did, including side effects,
with no secret additional effect of making you ultra-loyal to the regime too.
Some folks might want to outlaw that or at least control it, prescription only or only
administered by a doctor so someone couldn't slip it to someone else, but there'd certainly
be a big market for such voluntary mind control.
Most of us would still regard such scenarios warily, but wouldn't call that brainwashing.
The effect was reasonable and they chose to do it, there was no coercion involved, or
at least no unreasonable coercion for the prisoner example.
We have to contemplate higher tech scenarios for this though, and it's good to set our
moral groundwork first, that you either agree with the reasoning thus far, and why, or do
not, and why not.
Giving someone a scientifically formulated aphrodisiac and giving them some love potion
brewed up by a medieval witch or alchemist are ethically identical if the person administering
them believes both work; it doesn't matter that the latter is just sucrose and water,
any more than it's okay to shoot someone with a gun that happened to be loaded with blanks
if you thought the ammo was genuine when you pulled the trigger.
That's important to keep in mind because, for instance, right now we invest a lot of
money into marketing and advertising, and that includes research to make it more effective,
which is blatantly an attempt to influence your behavior and mind.
This is mostly viewed as okay though as it is just influence, and people know they're
being influenced and to what end and why, and they can resist it, and the folks doing
it believe that too.
The game might be a bit different if some computer was exactly tailoring a message to
you as an individual and with such effectiveness you had no realistic way of not doing what
they desired.
That's part of the perceived danger of the various technological approaches visited in
science fiction, they are seen as not being resistible, either because they are not or
we have no experience identifying that method of influence or countering it.
Perfume or cologne is an attempt to influence people, but we know it, can obviously detect it,
and the effect is mild and easily resisted if one wishes to.
But many would be different, you might not know, you might not be used to it, and you
might not be able to resist it even if you wanted to.
There's so many avenues too.
Chemicals, visual stimuli, pheromones, hormones, subliminal messages, and so on.
Indeed pretty much anything connected to your brain to feed it nutrients or data can be
used to influence someone's thinking, and the higher the bandwidth, the more subtly
or thoroughly or quickly one can do it.
Vision is very high-bandwidth, millions of bytes of data a second, far too much for your
conscious mind to process, and thus particularly vulnerable to sneaking something through,
such as subliminal messages.
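To put rough numbers on that bandwidth mismatch, here's a quick back-of-envelope sketch; the fiber count, per-fiber rate, and conscious-throughput figure are order-of-magnitude assumptions drawn from commonly cited estimates, not precise measurements:

```python
# Back-of-envelope comparison of visual input vs. conscious processing.
# All figures are order-of-magnitude assumptions, not measurements.
axons = 1_000_000        # optic nerve fibers per eye (rough estimate)
bits_per_axon = 10       # assumed bits per second per fiber
visual_bits = axons * bits_per_axon * 2   # both eyes, bits per second

conscious_bits = 50      # a commonly cited estimate of conscious throughput

print(f"visual input: ~{visual_bits / 8 / 1e6:.1f} MB/s")   # ~2.5 MB/s
print(f"ratio: ~{visual_bits // conscious_bits:,}x what consciousness handles")
```

Even with generous uncertainty in those figures, a gap of several orders of magnitude remains, and that gap is what leaves room for information to slip past conscious scrutiny.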
So, imagine something even more high-bandwidth, against which we have no neurological,
biological, or cultural defenses.
This would be the brainwashing ray or direct implants into the body.
The typical DNA or RNA in a microbe or virus already has an awful lot of data in it too,
so a tailored virus or pathogen need not be limited to simply screwing with your biochemistry,
it could contain images.
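As a rough illustration of how much raw data a genome can carry at two bits per nucleotide, here is a small sketch; the genome sizes are approximate published figures used only for scale:

```python
BITS_PER_BASE = 2  # four nucleotides (A, C, G, T) -> 2 bits each

def genome_bytes(bases: int) -> int:
    """Raw information capacity of a DNA sequence, in bytes."""
    return bases * BITS_PER_BASE // 8

# Approximate genome sizes, for scale only:
examples = {
    "typical RNA virus (~10 kb)": 10_000,
    "phage lambda (~48.5 kb)": 48_500,
    "E. coli (~4.6 Mb)": 4_600_000,
}
for name, bases in examples.items():
    print(f"{name}: ~{genome_bytes(bases):,} bytes")
```

A bacterium's worth of DNA comes out around a megabyte, comfortably enough for compressed images, so splitting a payload across a whole slew of engineered viruses is not an information-capacity problem.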
We get an example of something similar in Alastair Reynolds' novels Chasm City and
Absolution Gap, with something called an Indoctrinal Virus, that can infect someone and give them
visions or predispose them to believe something, not just screw them up chemically.
And you hardly would have to limit yourself to just one virus either, you could infect
someone with a whole slew of viruses, each containing different chunks of data.
We could also send in nanomachines to do such a thing.
Needless to say if you've got neural implants already, neural laces in your head or machines
to augment mind or body, you are even more vulnerable to very high bandwidth attacks
such as neural hacking.
And your brain is quite susceptible to electromagnetism, so while the brain-controlling rays
or fields of classic scifi are rather naïve about the implied complexity, something like
them is probably possible.
We do want to be careful though, because these new methods aren't necessarily any more
dangerous than existing ones except in being unfamiliar and more sophisticated.
We can get familiar with them and our defenses can be sophisticated too.
We've been dealing with fears of mind-controlling drugs or devices for generations, and with
computer viruses for decades, and while those are legitimate concerns, nobody's brainwashed
everyone or hacked everyone's computers so far.
Defense has been slow to improve but has kept up.
It's likely to be a big market in the future though.
You buy things to enhance the mind, or improve it, hardware or software, and you buy things
to protect the mind, and people will tend to avoid a lot of new technology till it's
been tested and out there working for a while.
You might want the newest and best brain-enhancing devices but hesitate because of cost and unknown
risk, you don't want the buggy or vulnerable new stuff, or to be offline waiting for a
patch.
Beyond that we have the concern of something radically new getting in before we can defend
against it.
Or something developed and deployed in secret by some shadowy group.
But we're mostly worried about the slippery slope of good intentions.
That's a very legitimate concern, and the scariest part about it is that technology
is dangerous and brainwashing is actually one of the best defenses against dangerous
tech.
As an example, one of our big concerns is someone might develop a way to make nanotech
or 3D printers or replicators that can make just about anything from a blueprint, so someone
could make a doomsday device or super-virus in their basement; any single crazed lunatic,
just one lone wolf, could wreak havoc on us or even destroy us all.
If a shadowy group or totalitarian government controlling us is a threat, at least it's
usually assumed to be somewhat sane, just villainous.
We've no shortage of individuals who are crazy and the idea that any one of them might
kill us all could drive a society to want to limit such technologies.
The alternative to such limits on tech is limits on minds instead.
Imagine if we felt the only way to keep us all safe would be to all get mind scanned
for dangerous tendencies and controlled to prevent them.
These could still be somewhat voluntary and customizable too.
You might give everyone an implant that prevented them from engaging in mass murder, but you
might let folks pick between a range of options instead and based on their security risk.
You can't learn certain sciences without agreeing to be conditioned against using
them for certain purposes or teaching them to others without permission.
You cannot operate a 3D printer or train to use one without agreeing to be conditioned
not to use, make, or distribute banned or restricted templates.
You might get to select between being conditioned to be non-violent or be able to pick instead
to have your mind scanned occasionally for instability or be followed by a drone that
watches you.
Some folks might prefer conditioning to not do something they really don't want to do
anyway if that exempted them from privacy intrusions.
This raises the slippery slope issue, even assuming such a civilization isn't already
off the cliff and over the moral event horizon, but it also raises one of the weirder Fermi
Paradox solutions.
With the Fermi Paradox, the question of why the Universe seems absent of other intelligent
life even though it is ancient and immense, we always have a problem of why civilizations
don't spread out.
Common suggestions are that they can't, because space travel might be impractical,
or because civilizations kill themselves off, or because intelligence is just super-rare.
Alternatives tend to focus on why civilizations might not want to spread out to the galaxy.
A point I once raised in discussing this is that it doesn't matter if most people in
a civilization don't want to colonize the galaxy, because some of them probably will,
and if it is practical to do so, then it only takes a handful of people in a civilization,
any civilization, to colonize the whole galaxy.
Unless you are willing to flat out blow up any colony ship that tries to head out of
your system, it really doesn't matter if most of your people don't want to colonize
the galaxy.
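The "only takes a handful" point is just exponential growth. A minimal sketch, assuming each colony founds one new colony per cycle (the 500-year cycle length is a hypothetical figure) and setting aside travel-time limits:

```python
import math

def doublings_needed(stars: int) -> int:
    """Number of doublings for one colony to exceed `stars` colonies."""
    return math.ceil(math.log2(stars))

# Assumptions: ~1e11 stars in the galaxy, each colony founding one new
# colony every 500 years. Real expansion is also capped by light-speed
# travel across ~100,000 light years, but the doubling count stays tiny.
CYCLE_YEARS = 500
STARS = 10**11
print(doublings_needed(STARS))                # 37 doublings
print(doublings_needed(STARS) * CYCLE_YEARS)  # 18,500 years of doublings
```

On cosmic timescales that is an eyeblink, which is why even a tiny expansionist minority ends up dominating the outcome.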
But if your civilization feels that limited mind control is the only way to keep everyone
safe, colonization could get to be rather dangerous to you.
All those folks elsewhere, separated by decades or centuries of light lag, are hard to monitor,
to reinforce their conditioning if it weakens, or to do anything about if they slip off track.
Remember, this could be a real danger too; it's possible just one person could make
a doomsday device you can't defend against, and such a thing could be manufactured at
Alpha Centauri and sent back to Earth too.
One colonist breaking their conditioning might be able to walk over to the colony's 3D
printer, and a minute later have a device that lets them take over their whole colony
and send back a genocidal armada to Earth and all of its colonies too.
Alternatively a solar system is a big place, and we've seen how many people you can put
in one and how long you can extend the lifetime of a sun or protect yourself from natural
threats.
You might feel that is more than enough and more than safe enough, and if such technologies
exist they mess with our Exclusivity issue with the Fermi Paradox.
We often toss out Fermi Paradox solutions not because it's improbable a civilization
might do something, but because it's improbable every civilization would. Space travel
wouldn't seem exclusively limited to peace-loving aliens who dislike meeting primitive
cultures, or don't like to interfere with them, so solutions reliant on aliens staying
away from Earth out of disinterest or principled non-interference don't work well.
Even though many would probably do one or both.
However, if technologies exist which are ultra-dangerous and can be easily created by any one person,
that's a threat every civilization would have to deal with and not many solutions come
to mind, indeed that would seem like your options are extinction or mind control, though
I would imagine, or at least hope, there were some alternatives.
If there were not, you might easily have a Universe that was full of nothing but isolated
mind-controlled worlds as islands in a vaster sea of empty or dead ones.
Nobody expands out much for safety, and nobody talks much because there's not much to gain
from doing so, and it does enhance risk.
Including the risk that another civilization might think your safety controls weren't
good enough and come by to enhance it with better mind control of their own, or just
wipe you out.
The potential gain, new technologies and new ideas, new science or art or philosophies,
what we tend to view as the big boon of meeting a new civilization, is probably not very attractive
to them since those could rock their very fragile boat.
I don't think this scenario is too likely, indeed I tend to suspect that we will constantly
be improving all our counter-measures for dangerous new technology right along with
that new technology, but it drives home the point that mind controlling technology is
potentially very seductive even to civilizations that are pretty benevolent and free of corruption,
even ignoring how easy it is to slide into a totalitarian police state--or ironically
even worse, a totalitarian state that doesn't need police anymore.
It's a really scary thought and for that reason one popular in fiction.
From films like A Clockwork Orange to books like Lowry's The Giver or Huxley's Brave
New World, or scifi episodes like Star Trek: The Next Generation's "Chain of Command"
or Blake's 7, we see a lot of authoritarian dystopias that use such methods, often
arising from good intentions.
The Big Brother of all of these fictional works, though, the one that inspired so many
others and the terms we regularly use nowadays like Big Brother, is George Orwell's 1984,
and it really paints a portrait of how you don't even need sophisticated technology
or a contrived plot for a grim, authoritarian, essentially invincible police state to arise.
I also find it rather grimly amusing that the book has often been banned in various
times and places as subversive or corrupting.
A very influential work, as mentioned, and one adapted to film or TV quite a few times,
often quite well too, though as usual, the book is better.
If you haven't read it, I certainly recommend doing so, and you can pick up a free copy
of 1984 today, just use my link in this episode's description, Audible.com/Isaac or text Isaac
to 500-500 to get a free book and 30 day free trial, and that book is yours to keep, whether
you stay on with Audible or not.
So a pretty grim topic today, but an important one.
Next week we'll be looking at something rather more upbeat, as we start off the Earth
2.0 series by looking at Seasteading and making artificial islands, and we'll move on a
couple weeks later to explore deeper seas with Colonizing the Oceans.
Before that though, we'll return to the Generation Ships series to contemplate how
you would keep a culture strong and stable on such a ship over the many millennia it
might need to exist to achieve its mission, and just how long such a ship could be deployed,
in "Ark of a Million Years."
As a last note, we've talked occasionally
of doing an end of the month livestream for Q&A, and we'll be doing our first one this
upcoming Sunday, September 30th, at 2pm Eastern, 1800 UTC.
We'll continue doing a monthly livestream after that, though we'll figure out the
time, dates and show format as we go.
For this first time, though, it will be this Sunday Afternoon, and I hope to see you then!
For alerts when those and other episodes come out, make sure to subscribe to the channel
and hit the Notifications bell.
And if you enjoyed this episode, please hit the like button and share it with others.
Until next time, thanks for watching, and have a Great Week!
-------------------------------------------
My Summer Car Turkish / Episode 1 / 2018 - Duration: 18:17.
-------------------------------------------
Hello Its Halloween | Lego Dance | Scary Rhymes For Children | Kids Channel - Duration: 4:06.
Watch out..
The monsters are around; if you're all alone give your friends a shout...
The ghosts and the spooks coming out of the nooks
As your doorbell rings can you hear them sing..
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
The air is cool and the moon is full; turn on the light before the vampires bite
The witches are cooking the zombies are looking..
As your doorbell rings can you hear them sing...
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
Watch out
The monsters are around; if you're all alone give your friends a shout...
The ghosts and the spooks coming out of the nooks.
As your doorbell rings can you hear them sing...
The air is cool and the moon is full; turn on the light before the vampires bite
The witches are cooking the zombies are looking..
As your doorbell rings can you hear them sing.
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
Hello, It's Halloween
-------------------------------------------
Great Opportunity To Travel Russia Visa Without Agent & Requirements 2018 - Duration: 5:15.
Subscribe Now
-------------------------------------------
Overwatch Moments #176 - Duration: 10:47.
-------------------------------------------
The Undertaker vs. Triple H (WWE Network Collection Intro) - Duration: 2:09.
[MUSIC]
Ladies and gentlemen, we are experiencing history.
>> [APPLAUSE] >> Two of the all time
greats to ever step foot inside the ring.
>> It has been six long years, since one of the greatest
rivalries in WWE history was said to be dead and buried.
>> But legends, legends never die.
>> At WWE Super Show-Down in Melbourne, Australia,
The Undertaker and I have some unfinished business.
>> There's only one thing left for The Game, that is to end the Undertaker.
>> Frustration.
>> Stay down.
>> Desperation.
>> What's wrong with you?
>> The emotion overcoming The Game with his obsession.
>> What? >> How close could you get without getting
a three?
>> Beating The Undertaker on a big stage is the missing piece on Triple H's legacy.
>> I'll give you one more chance at immortality.
>> The one rival Triple H has never defeated.
>> You want an end?
You got it.
[MUSIC]
>> It is a fantasy match.
>> Two legends will collide.
[MUSIC]
>> An image we can never replicate.
>> Two icons will do battle.
>> And a moment that will live in infamy in WWE.
>> And Undertaker, I promise you, this is no game.
[MUSIC]
The last chapter in one of the WWE's most storied rivalries.
The Undertaker, Triple H, for the last time ever.
>> Time to play The Game.
-------------------------------------------
Who Should Buy The iPhone XS? - Duration: 6:00.
- Hey guys, this is Austin.
The iPhone XS is one of the best phones you can buy today,
which makes sense as it's also one of the most
expensive flagships ever.
As opposed to doing our normal Is It Worth It
video this year, instead I think it's a lot
more useful if we answer the most important question
with any new gadget launch.
Who should buy the iPhone XS?
One big reason to pick up the XS or the XS Max
is if you're already in the Apple ecosystem
and you're looking for an upgrade,
especially if you're coming from an older device
such as an iPhone 6 or 6s, this is a big step forward.
Take the camera, for example.
Put it side by side with the 6s and you will
notice a big difference.
Now part of this is because there are
several generations between the phones,
which means that things like color science,
dynamic range, and video quality
have all been improved.
But you're also getting additional functionality,
including the second telephoto option,
which is especially useful for the portrait mode,
which is actually pretty good on the XS.
It's also hard to overlook the difference
in screen size.
Gone are the days of huge bezels on top and bottom.
Instead, the new iPhone is basically all screen
up front, including the notch.
Oh yes, my friends, the notch.
Look, personally I don't mind the notch.
Not only is it on basically every flagship
phone out there, but at this point, while sure,
you might think it looks ugly,
almost anyone is going to be able
to get used to it very quickly.
That notch houses one of the biggest differences
between the XS and previous generations.
Face ID.
Put simply, instead of having a home button
with Touch ID, you use your face
to unlock the X and the XS.
And, generally speaking, it works pretty well.
Certainly not perfect,
but as far as I'm concerned, not bad.
There are also some software updates compared to legacy phones,
some of which are pretty substantial.
One of my favorites is the gesture-based navigation,
since you don't have a home button to be able
to exit apps, as well as swipe in and out
of multi-tasking and whatnot.
This makes it a lot faster to be able
to just swipe on the bottom of the screen.
Performance is also a consideration.
Every year, the iPhone does get faster.
And while the A-Series processors have been
really impressive, especially the last couple years,
when you compare the XS to the X,
and especially going back to the 7 and the 6s,
you're gonna notice a pretty substantial difference.
Sure, the camera is a solid update this year,
but it is certainly not worth dropping
another $1000 to get.
Same thing goes for the performance.
Yes, on paper the XS is faster,
but realistically, there's almost no
real world difference.
More than any phone I've used in years,
the iPhone X has held up so well
over a full year of use.
The battery life still lasts all day long.
It still feels fast, the camera is great.
If you have one and you're not, like,
rocking a broken screen or something,
it is almost impossible for me to recommend
it over the XS.
No, no, I didn't actually mean that.
That's the wrong,
I flipped that.
The X still feels every bit as fast
as it did last year, and even more so
now that iOS 12 is out.
Speaking of, let's actually talk about iOS 12
for a second.
As opposed to most updates,
which make your iPhone feel slower,
12 legitimately does make even older
iPhones feel a lot faster.
I'll be real.
If my job wasn't to review the latest tech,
I would not have updated my iPhone X to an XS.
It's good, but the upgrades are just not worth it.
On the other hand, one reason why you might
wanna pick up the XS is longevity.
It's really impressive that Apple is still
supporting the iPhone 5s after five years.
And honestly, the software support on the iOS side,
it's one of the major advantages over Android
in my opinion.
If the XS gets that same level of support,
it is a serious selling point.
Sure, $1000 for a phone is expensive.
But when you break that up over three,
four, or even five years of usable,
well, use, that's pretty cool.
There's also the fact that the updated screen design
is the new standard going forward,
which is a good sign for app support.
Long story short.
If you want to pick up a new iPhone today,
the XS has a good shot of being able
to last quite a while.
On the other hand, not everyone wants to spend
$1000 on a new phone.
Sure, the XS is great, but there's a lot of
other things you could do with $1000.
Like pay rent, or buy a lot of Subway sandwiches.
- [Ken] What?
- [Matt] $5.00 footlong.
- That's a lot of $5.00 footlongs.
That's-- - They're not doing that.
- 200 of them. - They're not, they're not
doing that anymore. - No.
- Wait, they're not?
- [Matt] That's not a thing anymore.
- Not only can you pick up a much cheaper
Android phone such as the Pocophone F1
for a lot less than half the price,
but there are also other iOS options
that can do a decent job of not costing $1000,
such as the iPhone 7, which is now $450.
Sure, it's probably not going to last as long,
but you're still getting a solid camera,
really reasonable performance,
and it costs less than half the price of the XS.
One point in favor of the XS,
specifically with the Max, is if you're really
into media consumption.
Which, odds are if you're watching this video right now,
you probably are.
Based on early numbers, it seems like the XS Max
is heavily outselling the standard XS,
and that's really only for one reason,
the fact that it's got that bigger display.
Besides that, there's actually really no difference
between the XS and the XS Max.
The 6.5 inch OLED display is about as good
as it gets for a smartphone.
Not only is the brightness, and the color,
and especially the contrast, terrific,
but it does support all the fancy new
display technologies, including HDR10
and Dolby Vision.
The improved speakers also make a difference.
Not only are they louder, but they also have
better stereo separation.
And that, combined with the screen,
makes the XS Max an excellent media consumption device.
Honestly, the main reason to not buy
the iPhone XS is really simple, the iPhone XR.
Take everything that's good about the XS,
add a wide variety of colors,
remove the telephoto camera while still
keeping the main camera intact,
and replace the OLED display with an IPS panel,
and boom, you've got yourself an iPhone XR,
which just so happens to be a full $250 cheaper.
That's a big difference compared to past generations.
Previously, Apple would knock about $100 off of
the entry-level iPhone.
But this year, you're getting the XR with almost
the exact same spec but with a major price cut.
Something that's easy to miss is that while the XR
does have a lower pixel density,
on par with the iPhone 8,
the actual screen size itself is right
in between the XS and the XS Max.
The XR won't be out until late October,
so I can't give it my full recommendation just yet.
But on paper, it really does look like the full package.
You just aren't missing much compared to the XS.
And that is my recommendation.
If you really want the best and have the
budget for it, by all means pick up the XS.
It's a great phone.
For almost everyone else though,
I really feel like waiting for the XR is worth it.
-------------------------------------------
Articulated doll - Crochet doll Mia and Mei - Duration: 4:36.For more information (in Spanish), visit:
ganchigurumi.com/blog
Correction: the free pattern in English is on my blog:
ganchigurumi.com/blog (sorry for the inconvenience)
Thanks for watching!
-------------------------------------------
FNAF SFM: Best Five Nights at Freddy's Animations Compilation - Duration: 10:01.
-------------------------------------------
LEGENDARY MONTAGE SPECIAL FOR 5,000 SUBSCRIBERS - (DİNAMİKMAN FAMILY, 5,000 PEOPLE) - Duration: 2:37.
-------------------------------------------
Dom-2 news, 28 September 2018 (28.09.2018), ahead of broadcast - Duration: 5:04.
-------------------------------------------
Cristin Milioti Movies List - Duration: 0:48.Cristin Milioti Movies List
-------------------------------------------
GTA Online Stunt Race "Trench I"(1:01.850)best lap - Duration: 1:52.These records never satisfied me.
I wanted to get 1:01 so badly...
I thought 1:01 was around the corner, but I was wrong.
1:02.5~ very hard / 1:02.0~ super hard / 1:01.x~ Ninja
These attempts made me crazy.(as always)
But they had made me stronger little by little, and...
Finally the time has come.
oohomuij's trick (please check his video.)
God speed me...
OK
Should have glided further, noob (lost 0.2 sec)
My 1450 laps paid off.
Thank you for watching!