Thank you for the video. I found out that investing is not for everybody; you need a strong stomach to see your portfolio go down. It might be wiser for a novice to start with copy trading, but it is not easy. Investing in growth stocks is another level entirely; you definitely need to know what you are doing.
From my own point of view, you need to invest smartly if you want the good things in life. So far I've made over $255k in raw profits in just 6 months in the market from my diversified portfolio strategy, and I believe anyone can do it if you have the right strategy. Mutual funds take a long time, but investing smartly is the key for the short term. Most of us tend to pay more attention to the shiniest position in the market at the cost of proper diversification.
I've been solely investing in real estate. But with the recent hyper home pricing I've liquidated a few things and have $45k in cash lying around idle. I would love to get your recommendations; I'm in search of something lucrative in the current crazy markets.
My portfolio is very much diversified, so it's not like I have a particular fund I invest in, plus I don't do it by myself. I follow the trades of Mrs Karen Gaye Gray.
She is a popular broker you might have heard of. I can honestly say she's worth her salt as a financial advisor, as her diversification skills are top notch. I see it in her results: my portfolio grows by an average of 10 to 15% per month, unlike my IRA, which has just been trudging along. My portfolio simply mirrors what she trades, not just particular industries of my choosing.
By following her trades, do you mean copying her trades? I have heard about copy trading.
Not testing the software on all types of people before real world deployment is almost criminal negligence.
Corporations thrive on criminal negligence
Testing the technology on millions of people’s photos without express consent AND clear means of recourse is criminal…almost, or not.
I'm sure there's a 4th Amendment case (unreasonable searches) against police departments using this technology. Unfortunately the courts are treating everything online, no matter how carefully curated or secured, as "public" and outside a reasonable expectation of privacy.
This is the sort of thing that necessitated the EU's GDPR, a variation of which California is trying to implement. But the US courts seem to be cavalier about the consequences of striking it down in the name of "freedom". I don't think any American reasonably expects their likeness to be used as training data for an Orwellian search engine operated by the state.
And that's before we get into the abusive behavior of Clearview. They "disappeared" the reporter investigating them in their model, spooking the cops whom she was interviewing. If they're willing to do that kind of Soviet/Chinese power play, they're clearly willing to do anything with our data.
Expectation of privacy is ONE thing.
Having one's photo and/or personal information published by "anonymous strangers" with vile and abusive comments is quite another.
At what point is your own imagine AND your own name...YOUR OWN...and no one else's?
Sorry...your own IMAGE.
Auto-correct mistake.
And yet people keep shoving pics of themselves onto the internet. I've been telling friends not to post pics of their kids for years, for privacy reasons.
Well, this is highly disturbing.
I remember when the internet came out. How great it all seemed. Having a library at your finger tips.
Turned out to be the downfall of Humanity.
You didn’t read 1984?
These MF's don't read, ask if they've seen the movie😂
In an age of mistrust, we now also have to deal with this. Living in a world of unknowns while acting like it's normal. Hasn't life become a game of chance?
The brazen attitudes of these companies always surprise me. This isn't technology that can be squashed altogether, but our governments can eliminate the legal/punitive weight behind a computer's guesswork... people shouldn't be arrested for the crime of looking a certain way.
Do the people falsely arrested and detained due to faulty AI at least get their legal fees covered by the police or the AI company?
Don't forget the civil suits that the taxpayers will end up paying for, one way or another.
We desperately need younger (and sane) people in Congress who understand technology. While we can't stop the tech from existing, we can legislate against its use, same as other countries have done, same as revenge porn has been legislated out of the mainstream. The implications of this tech being used, no matter the 'think of the children' benefits (that's always the first thing said when a horrific law or invasion of privacy is being snuck through with wide-reaching implications), will be devastating if not reined in. The cop example, where the company unilaterally decided what the cops could and couldn't see, points to both the ability and the willingness of a company to target whom it wants.. meaning it could suddenly increase false positives against someone it dislikes, hide people it's friendly with (or criminals who 'subscribe'), etc. National security implications were called out, but the risk of informant info being casually revealed will have a chilling effect on whistleblowers - especially against governments. And any lock left open for state actors (e.g. police) is almost automatically available to criminals - meaning
* blackmail scams (with actual blackmail pulled up by AI, since all of us have something we don't want our employers or families knowing),
* intimidation campaigns (finding and targeting anyone associated with Candidate-X, quickly identifying anyone at a protest.. and their entire social network),
* 'moral' campaigns (outing lgbtqia+ folk, targeting anyone who helps women needing to exercise bodily autonomy, singling out young people at spring break who end up in photos, hunting down people who have certain bumper stickers or who fly Trump flags or whatever),
* kidnapping and other social manipulation tactics ('oh it's ok, I am a friend of [person x, y, and z identified from a social media post], we all go to [location x identified from photos]'),
* social 'proofing' (you want to join our [group]? Let's do a quick search on your entire life history, not only in your social media but in every photo you may have ever appeared in across your entire life, even those taken by strangers where you're just walking around in the background... hmm.. I see 23 years ago your wife went to this rally / protest / gathering... don't think you're 'our kind of people')
and worse.
This is beyond horrific and probably the most socially destructive technology I've ever heard of.
Spot on. Frightening.
Thank you so much for this--I think it's too late to be reined in--and can you imagine what uses it's being put to, aside from those you've bravely uncovered? So very creepy and scary.
In Europe they have some privacy laws. The US should follow suit.
Who repays the costs and impacts of being falsely accused? The company that provided the tool that produced the false positive should bear the costs of the misuse of its product. We have product liability laws for a reason -- this is a perfect use case.
Like private credit bureaus controlling your life, these private biometric companies need regulation or your public life can be destroyed by a non-governmental agency.
In the meantime a person can be bankrupted with legal fees.
The Machine!
China has done this for years. What's next? Giving people scores?
They probably do not care if you are not guilty either.
Already happening!
search for "All your base are belong to us" - from a 1989 arcade game
I think I saw this Black Mirror episode.
It is mind-blowing that facial recognition can identify me from a picture taken when I was a toddler.
Tragic missed opportunity to use "Your Face Are Belong to Us"😢
Claiming themselves to be wise (in THEIR OWN ESTIMATION) they became FOOLS.
I'm not against the use of facial recognition, because it can probably catch a lot of criminals and decrease the opportunity for people to commit all sorts of crimes -- it can make the world safer. But there has to be *serious* regulation with careful consideration of the ethics and all the possible scenarios that might arise from the use of such technology.
In a perfect world... one would think that AI is used for well-intended purposes and by well-intended people. Until your name gets matched with someone else's photo, and vice versa... and all the other variables. Many criminals do exactly that.
It has become more difficult for a person to prove that they are really THEMSELVES than for criminals to use their identity, and with all this technology it has been that way for many years already.
Craziness.
We need a set of social norms that increase the likelihood that, most of the time, most people will believe the polite thing is to mind their own business. Additionally, we need pervasively understood ways to make it clear when it is all right, even welcome, for someone to take advantage of the knowledge they can easily acquire about you. All of this in ways consistent with the rule that you should always have solid, safe, effective, and reliable ways to ignore those who know a lot about you - if they don't interfere in your life in any way you care about, what does it matter that they know? However, that condition does not yet consistently exist.
Remember when we all got the notice for 5G and had to respond to it? We all had our faces snapshotted at that time. Who got those snapshots? If you owned a phone, your photo got taken. We had no control over that. So is that a US mug shot book?
Not if the grid goes down
The facial recognition my phone uses only unlocks it about 1/3 of the time.
That you know of.
Some programs can open your audio and video and such without your knowledge.
I remember, about 10 years ago, when free apps came out where you could put your own face on funny animated Christmas card caricatures, and it came out that they were collecting facial images for facial recognition and other future use.
masks don't seem so bad now
😳😲😬
You missed a golden opportunity to use “All your face are belong to us”
holy mother of god
How about that for a video title? Got grammar?
Like it!
What's wrong with the grammar in the title?