Sunday, April 30, 2017

Pillow Talk

I liked the idea of Pillow Talk, but it does seem to be incomplete. The app would not help women who are passed out due to alcohol or drugs (roofies), but it would help women who knew they were in danger. The app's name, Pillow Talk, is supposed to keep potential predators off the scent, but it seems like the app would be found out pretty quickly by some kind of NowThis video.

Cultural Misappropriation

At first, I didn't like their idea for the app, because culture is meant to be shared and spread. Culture is something to be enjoyed by everyone, but while culture is meant to be shared, it is not meant to be stolen. I can agree with their presentation to an extent, though I do disagree about Miley's twerking being sexy. I agree that people can take things too far, such as wearing headdresses, and that the cure for disrespect is often the same as the cure for ignorance: actually learning about the cultural significance of something before just straight up doing it or wearing it.
Tanquesha, Carrington, and Dena's (sorry if I misspelled any names) cultural appropriation app was interesting.  The app was created to help prevent cultural misappropriation, which means adopting certain aspects of another's culture wrongly and offensively.  The app would be created to help keep people aware of other cultures, especially with how one dresses.  For example, if a person of white ethnicity decides they want to wear a bindi one day, they can tell the app what race they are and what they want to wear that is a part of another ethnicity, and the app will tell them whether that is appropriate to do or not.  I think this app has good intentions, because preventing cultural misappropriation and the potential "making fun" of other ethnicities is a good act.  However, I do not think that people would actually put forth the effort to download this app or use it.  Usually people just google what is okay and what is not okay when it comes to wearing things from other cultures.  I liked the idea, though, because it was interesting and unique.
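
If I had to guess at the mechanics, the app would basically be a lookup table. Here is a minimal sketch in Python of how that lookup might work; the items, labels, and advice strings are all made up for illustration and are not from the group's presentation.

    # Hypothetical lookup: (wearer's background, item) -> guidance string.
    # In a real version the table would have to be curated by people from the cultures involved.
    GUIDANCE = {
        ("not south asian", "bindi"): "Has religious/cultural significance; read up on its meaning before wearing one.",
        ("not native american", "headdress"): "Earned regalia in many nations; generally considered disrespectful to wear as a costume.",
    }

    def check_item(background: str, item: str) -> str:
        """Return curated guidance for the item, or a default nudge to research it."""
        key = (background.strip().lower(), item.strip().lower())
        return GUIDANCE.get(key, "No entry found; learn the item's cultural significance before wearing it.")

    print(check_item("not South Asian", "bindi"))

Even a toy version like this makes it clear that the hard part isn't the code, it's deciding who curates the table, which is probably why most people just google it instead.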

Hunger Watch

Jacob, Armando, and Sebastian came up with the idea of Hunger Watch. I really like this idea because of the issue of world hunger today and the issue of homelessness. The way this works is through a beeper: a homeless person has one and a business has another. The business sends out a message to any beeper within a two-mile radius of the business. I think it will definitely help with hunger issues, but we can't get around the fact that this is illegal.

Cultural Appropriation

Carington, Dena, and Tanquesha (I'm sorry if I misspelled any names) made an app about cultural appropriation. Their app explains their aims and what cultural appropriation is. I think the idea was good, but I'm not quite sure if it will be effective. The way this app works is you type in your ethnicity and then what you plan to wear, and the app tells you if it is appropriate or not. I like the idea, being of Choctaw descent and knowing the sacredness behind a headdress. I just don't think this will be very effective given the stubbornness of our society.

Friday, April 28, 2017

"Hunger Watch"

The Hunger Watch is essentially a beeper connected to a program that allows businesses to donate their wasted food to homeless people. The way the program is supposed to work is that various restaurants donate food that they would have otherwise thrown away to the homeless. The Hunger Watch will beep when the food is ready. The businesses would have to sign up and log all of the food that they are giving away, and in return, the Hunger Watch organization will allow them to get a tax break. The people who are receiving this service will also register in some way to ensure that no one is taking advantage of the system. While this is a good idea, I think that restaurants would be much more willing to give away their wasted food if the organization were marketed as a charity. There would be no question of whether or not the food was really going where it should, because the creators of the Hunger Watch would be the ones facilitating the process. Many businesses would be more open to donating to a charity than directly to people. If the organization had its own facility, they could take the food back and distribute it themselves. That way, restaurants wouldn't have to use much of their own labor to contribute to a cause that doesn't necessarily benefit them that much.
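
To make the two-mile radius idea a little more concrete, here is a minimal sketch of the beeper broadcast, assuming every registered beeper reports a fixed location. The coordinates, names, and the distance check are my own assumptions, not part of the group's actual design.

    import math

    EARTH_RADIUS_MILES = 3958.8

    def distance_miles(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance between two points, in miles."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

    def notify_nearby(business, beepers, radius_miles=2.0):
        """Return the registered beepers within the radius of the business."""
        return [b for b in beepers
                if distance_miles(business["lat"], business["lon"], b["lat"], b["lon"]) <= radius_miles]

    # Example with made-up Memphis-area coordinates.
    restaurant = {"name": "Example Deli", "lat": 35.139, "lon": -90.048}
    beepers = [{"id": 1, "lat": 35.145, "lon": -90.050}, {"id": 2, "lat": 35.300, "lon": -89.900}]
    print(notify_nearby(restaurant, beepers))  # only beeper 1 is within 2 miles

The same radius check would work whether the hardware is an actual beeper or just a phone app, which is probably the cheaper route.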

Thursday, April 27, 2017

"Pillow Talk"

     “Pillow Talk” is a neat app that users can download on iOS or Android. The concept of the app was developed by my fellow classmates Ashely Childs, Daniel Ashcraft, and Mollie Wadsworth. The app is designed specifically for victims, and potential victims, of rape and molestation. The app is free and allows users to alert the authorities if they feel victimized. However, the police department will not respond immediately; first, the app sends the user a message asking “are you okay?” If the user does not respond within two minutes, then the police department will dispatch and arrive at the scene. The app also has a nifty feature that lets users scan their immediate area for potential sexual predators, letting the user decide whether the area is safe and whether they are comfortable with it. The app is currently under development by my classmates and a small group of engineers. My classmates are currently attempting to bring awareness to the city of Memphis to make the community we live and study in a better place. I personally think that they should maybe start small with the app. Maybe they could get the app developed first and use the school as a guinea pig to test it. Have the CBU community test the app first, then slowly spread the word of the app’s effectiveness. Showing how effective the app is in practice will increase the probability of having their voices heard, because apparently no one of importance wants to listen to hard-working young people who just want to better their community. I think with the slow and steady method they will succeed in their noble endeavor to better our community with a fast, free response for women and men who feel threatened and victimized. I’m cheering them on with spirit, and I wish them luck!
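
Here is a rough sketch of the two-minute check-in logic as I understood it from the presentation; the class name, the console messages, and the way the alert gets raised are my own placeholders, since the real app is still being developed.

    import threading

    CHECK_IN_SECONDS = 120  # the two-minute window described in the presentation

    class PillowTalkAlert:
        """Ask 'are you okay?' and dispatch help only if no response arrives in time."""

        def __init__(self, dispatch_callback):
            self._dispatch = dispatch_callback   # e.g. a call out to emergency services
            self._timer = None

        def user_feels_unsafe(self):
            # Step 1: the app asks the user to confirm they are okay.
            print("Are you okay? Respond within 2 minutes to cancel the alert.")
            self._timer = threading.Timer(CHECK_IN_SECONDS, self._dispatch)
            self._timer.start()

        def user_responded(self):
            # Step 2: any response inside the window cancels the dispatch.
            if self._timer is not None:
                self._timer.cancel()
                print("Glad you're okay. Alert cancelled.")

    # Example usage with a placeholder dispatcher.
    alert = PillowTalkAlert(lambda: print("Dispatching police to the user's location."))
    alert.user_feels_unsafe()
    alert.user_responded()  # responding in time cancels the dispatch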

Monday, April 24, 2017

San Junipero

San Junipero is one of my favorite Black Mirror episodes! It starts off with Yorkie on her first night in San Junipero. She goes into a bar/dance club called Tucker's and is just looking around. She spots the arcade part of the club and starts playing a game. She doesn't pay any attention to the guy at the club, but while sitting in a booth a girl named Kelly shows up. She seems to show a strong interest in Kelly but gets overwhelmed when they dance for the first time. Kelly shows the same sort of interest and asks Yorkie to come with her to her house. They know that they are both tourists and can only stay until midnight. After a while Kelly visits Yorkie in the real world and marries her, instead of Greg doing it, so that Yorkie can pass over. Eventually Kelly dies as well and they live in San Junipero together. But I wonder what it would be like years down the line. Will it get boring? Will people start to get sick of each other? There are plenty of questions, but we like to think it ends happily, considering this is one of the happier episodes.

Sunday, April 23, 2017

Online

San Junipero, henceforth referred to as Online because San Junipero is a pain to spell out. Online seems like a pretty cool place, maybe even cooler than real life, which could be a really big problem. I really hope someone in the future placed some sort of age limit on this, or some kind of stipulation that users must be in a nursing home to use it, but I might not have to worry about that since there was a time limit for the old people. The problem with alternate realities, and my argument for why they shouldn't exist unless under extreme laws and regulations, is that they're better than real life could ever be, because they're easier and you have more control to do things that you could never do in real life. That seems like it'd be awesome, because it would be, but take into account the damage that videogames have had on people and their social lives and outdoor activities, then imagine what effect an alternate reality would have. The streets are already empty, a testament to videogames and paranoid parents, but with the advent of alternate reality, people would never leave. Activities such as academics and going out would fall by the wayside in exchange for this new activity. Add the online social features it would have, and there would be no real need to leave the house, since you'd be able to see the version of your online friends that they'd want others to see.

There would be two possibilities: (a) nothing really changes, it's just all online, or (b) people prefer the games where they're the hero and the world is full of a bunch of interactive, interesting NPCs. If (a) happens, there could actually be some improvement in interactions and the way we understand the world, although an extreme negative in health outlooks, but if (b) happened, it'd be all over. Both would probably see an extreme decline in fertility, because NPCs are probably easier (I mean all senses of the word) than other human players, and especially than real life, since exercise would probably come to be unpopular over time. Don't believe me? Look at how much time people spend on social media, video games, and just on computers in general.

I previously stated that academics would fall by the wayside, so let me go into that a little bit. School is hard and usually boring, and if there's a somewhat affordable option that doesn't involve going, then everyone but a few would take it. People would still need food and other stuff like that, but we're getting to the point that machines can do everything we can do, and oftentimes better. Most if not all of the jobs are probably going to go to robots (look at Chili's and their touch screens), so it probably wouldn't affect productivity or anything, so what's the harm? It's not like there's any point in not playing videogames all day anyways. The only reason anybody seems to do anything these days, like go to college or get a job, is so that they can make money and buy the next overpriced phone or videogame, so maybe a world where everyone's plugged in would be an improvement.

White Bear

Even though most people prefer this episode over most others, it isn't one of my favorites. It starts off with a woman who doesn't know what's going on or where she is. This episode is quite bizarre, but I understand the reasoning. The woman goes out to search for help and finds people inside houses recording her. That's when a man pulls up in a blue car wearing a ski mask with an upside-down white "Y," identical to the tattoo her former fiance had on the back of his neck. This woman was the accomplice to her fiance in the killing of a little girl who had a white teddy bear, hence the name of the park and the episode. I wonder if she will figure out more about her past as the park continues. The whole thought of the park kind of worries me, only because it seems a little inhumane the way they erase her memory. But once again, I understand their reasoning behind wanting to make her suffer.

San Junipero

     San Junipero was an interesting episode of Black Mirror. It was about two women: a nerdy white woman and a hipster black woman. Kelly, the black woman, only cared about relationships with no strings attached at the time, while Yorkie quickly fell in love with Kelly. Kelly left Yorkie behind because she did not want to be attached to her; she was afraid of feelings. These encounters took place weekly, each ending at midnight. When Yorkie finally found Kelly again, she guilt-tripped her into admitting why she had seemingly abandoned her. As it turns out, the world the two women found themselves in was virtual: a world built by a computer that gives humans a world beyond death. After impulsively marrying Yorkie upon finding out she had been paralyzed for forty years due to a car crash, Kelly admitted that her own daughter and husband had died. She admitted this during their fight after the marriage. Hurt and full of despair, Kelly drove off, leaving Yorkie alone to wallow, and deliberately crashed her jeep. Yorkie arrives, but Kelly's weekly time is up at that very moment, and her virtual younger body disappears. Kelly has her body buried alongside her husband and daughter. However, she also opts to have her mind uploaded to San Junipero, where it happily reunites with Yorkie's. The consciousnesses of Yorkie and Kelly are installed in a massive server room, where robots maintain those who live in San Junipero. Personally, I think it's a way to cheat death. Humans always want to live forever by any means of immortality, even if it means an empty existence of the damned inside a cold, cruel machine; they will never have souls.

San Junipero

In this episode of the series Black Mirror, a young woman goes into a bar in a city called San Junipero. She meets another girl and they begin to fall in love. As the story progresses, you learn that this city is actually just a computer program that elderly people can go into before and after they die. Yorkie and her lover both end up in San Junipero after they die and live out eternity together. I thought that as a show altogether it was pretty good, but it sparked many questions that were left unanswered. Are there multiple cities like this to stop it from getting too crowded, or is it just one place everyone goes? Also, how big is this city? Finally, what would it cost to enter this false reality? This is a good idea for those who do not believe in an afterlife. I feel as though this is something that could benefit many older people who are beginning to become sick, so they can relive or remember things from their past.

White Bear

In the following episode of the series Black Mirror, a lady wakes up not remembering anything. When she stands up, she notices pills all around her, and she begins to explore the house she finds herself in. As she goes outside, she begins to ask people for help, but they say nothing and continuously videotape her with their cellphones. As she continues to explore, she runs into two people who then begin to explain to her what has been going on. As they progress through the story, they run into many situations in which they could die. This eventually leads them to White Bear, where the woman mentioned at the beginning of the episode finds out that this is a way to make her pay for kidnapping and killing a child. She must suffer by going through the same things she put the child through, and at the end of the night they erase her memory and it begins again every day. I feel as though this is not a good way to punish her for the crime she committed. She may have to relive this every day now, but she does not have time to think about what she did and feel any sort of remorse. She may have an hour or two to think about it when they take her back to start the process all over again, but during that time she may instead be thinking about having to relive this again. Her entire focus is no longer on what she did but rather on what will happen to her. She no longer has the chance to feel guilt for what she did or try to right her wrong.
Black Mirror's "San Junipero" episode was really quite interesting.  It starts out with this girl named Yorki awkwardly walking down the crowded city streets in San Junipero in the year of 1987 and then she walks into this bar/dance club.  She is pretty socially awkward but then she sees this arcade video game and excitedly walks over and starts playing it.  This dude starts to hit on her and asks if she wants to play this action car game.  The red car wrecks on the screen and she freaks out and frantically walks away.  She then meets this girl named Kelly and Kelly asks Yorki to dance with her.  Yorki reluctantly starts dancing but then things get heated and Yorki awkwardly walks away and out of the club.  One week later the same thing happens and Yorki is back at the club and she sees Kelly again.  She decides to have a fling with Kelly this time and she feels a connection.  Again, a week later the same thing keeps happening but one time, Yorki does not see Kelly.  Yorki decides to go to the infamous Quagmire bar and ask around for Kelly.   One dude she asked tells her to check another decade.  So Yorki goes to the 2000s and finally finds Kelly.  They end up having a relationship and Kelly tells Yorki of her husband that passed over after them being married for 49 years.  Kelly then tells Yorki that she is passing over soon and does not have time.  Kelly then crashes her red car just like the video game but then she ends up being alright.  They get married and pass over together in the end of the episode.  Everyone either has the option of being a full-timer and staying in San Junipero forever, or they pass over to the next life, whatever that may be.  

Friday, April 21, 2017

"San Junipero"

A major theme in today's society is youth. We often see older people trying to keep up with the trends of today's young people. Aging gracefully isn't a major concern anymore; in fact, we hardly want to age at all. In "San Junipero," as people approach the obstacle of death, they are able to participate in trial runs of a virtual reality that allows them to become young again and essentially gives them the freedom to do whatever they want and have whatever they want. San Junipero is a party town, and people are able to decide how much pain they do or don't want to feel. Then, when they die, they have the option of dissolving into the virtual world full time. I see San Junipero as a coping mechanism for people who aren't quite ready to die and, also, a coping mechanism for people who want to be young again. The natural cycle of life should teach us to accept that we are young, then we are old, and then we die. However, San Junipero allows people to refute this idea and feed into the "id" component of their personality. I also think that San Junipero is almost an unhealthy option for people to have because it promotes unhealthy ways of dealing with the circle of life.

Wednesday, April 19, 2017

White Bear

What is justice, and who has the right to determine what is just and what is inhumane? I don't want to suggest that what the girl did didn't deserve punishment, but I disagree with what she receives. My biggest problem was that the girl was being punished more than one time and that she did not know what she was being punished for. My other big disagreement was that they were using her to make money; instead of doing it for justice, they were doing it for profit. What right did they have to use her for profit? Is that what they called justice? Who decided that that was justice, and not immoral? It was scary to see how people behaved when they wanted justice, but it seemed like it was more revenge than justice. Revenge is not justice; revenge is an inner demon that satisfies one's anger.

Tuesday, April 18, 2017

Inhumane Justice

At what point does a person lose his or her right to be seen and treated as a human being? Does committing a crime mean that a person should no longer be treated as a human being and deserves instead to be treated as an animal or object? In our society, it is the case that committing a crime results in the confiscation of one's human status. Oftentimes, criminals are abused, enslaved, and fed awful food. The prison system kills the person physically, mentally, emotionally, and so on. In White Bear, the woman's human pass was revoked; she was caged, tormented, and deprived of nutrition. She was paraded around and treated as an object rather than a human.

Now I want to be clear that I understand that one is treated very differently when one is being punished, but am I not still a person? The argument for such treatment is that people who commit crimes such as murder are not acting as human beings but as animals or non-humans. To make such an argument is to deny that killing is a "human" thing; on a simple level, we kill animals for sport or survival. In the Bible, people killed as a commandment or to move into new lands, as in the Exodus story. Of course, I am not arguing about whether murder or killing is wrong, but questioning whether committing such an act dehumanizes an individual. Is a person who kills still not a person? When the woman recorded the murder of the girl, her genetic makeup did not change so that she became something other than human. For some reason, in our society, it is acceptable to torment people if they commit a crime, no matter how horrendous. Criminal: a person who has committed a crime; person: a human being regarded as an individual. So if criminals are human beings, why is there inhumane justice?

Monday, April 17, 2017

Eye for an Eye and the White Bear

The White Bear program, where the lady is forced to live through the same crime she committed over and over, can in no way be seen as justice. Justice, though harsh, is still fair. It would be one thing if she was forced to live through it once and was then sent to jail to carry out her sentence. For is an eye for an eye, when carried out by a justice system and not individuals, the best form of justice? Keep in mind that justice is synonymous with fairness: it is no more than what the criminal did to the victim and no less. Would it not be fair that if Riley attacked Johnny and sliced Johnny's face, making Johnny blind, then a form of justice would be blinding Riley?

Eye for an eye would get complicated in cases of rape, though, where it would become difficult for the punishment to be carried out, because the justice department would have to employ a rapist in order to mediate the punishment. It would be ethically questionable to put such a crime on the head of any man or woman; the guilt would be tremendous, and as we've seen with soldiers who come home from war and bring the war home with them, it could have a negative impact on their lives. The crime would then spread as it impacted the punisher and his life. But employing a known rapist, someone who would not be affected mentally by the act, would be, for lack of a better descriptor, feeding the beast, giving in to the criminal. While it could be argued that this known rapist is fulfilling a necessary position in society as the punisher of rapists, hiring the rapist to rape would amount to rewarding the act of raping and could encourage future rapists to act on their urges (don't worry, I gagged while that came out), which would go against one of the other factors of eye-for-an-eye punishment: fear.

The person must follow the golden rule of "treat others how you would want to be treated," because if not, they get treated the way that they treated others. Of course, you do find the occasional deviant who wants to be harmed the same way that he harms others, but we'll ignore that for now and possibly forever. The point is that eye for an eye relies on the fear that if I kill someone, I get killed. If I brutally harm someone, I get brutally harmed the same way. Then when we get these crimes that require someone to commit an act as heinous as rape, we cannot ethically make someone commit the act and then live with it, but then the fear of retribution is gone. The whole system would fall apart without that fear, and the justice system must act justly, otherwise what's the point? The argument would be that hiring someone to rape someone is unjust because either (A) the person enjoys raping other people or (B) the person does not enjoy raping other people and then has to live with the guilt and shame that comes from the act, along with any diseases the person being punished has.

We've established that eye for an eye would not work for this, and it can be agreed that preventative measures would be best, but there will always be rapists, and they must always be punished. The modern idea for punishment is to lock them away so that they can't harm anybody else while they reflect on their actions, but they're always released eventually, except those that get life sentences, which is uncommon for one-time events that aren't murder. This system has very mixed results; after all, many criminals go on to repeat the same crime.
There must be a system that (A) provides fear for any future perpetrators, (B) provides treatment and real rehabilitation for perpetrators, and (C) prevents any future crimes committed by the perpetrator, especially those of the same type. (A) and (C) will at times go hand in hand.

Saturday, April 15, 2017

Black Mirror's "White Bear" episode was very interesting and disturbing.  It starts out with this lady waking up and she sees a strange symbol on the computer screen.  She does not know where she is nor does she remember anything about herself or her past.  She walks outside and there are people hiding out everywhere recording her with their phones and she has no clue what is going on.  Then this guy in a mask starts chasing after her with a gun and the people do not even try to help her- they just remain recording her.  She then sees these two people by a gas station and they run inside and help her out.  The man of the two people ends up getting shot by the guy in the mask and the woman runs off with the other lady.  She protects her for the majority of the time until the end when she turns her back on her and begins publicly working with the masked guy.  Towards the end, the woman gets cornered by everyone that she has encountered and they capture her and tie her up in a chair and the curtain comes up.  It turns out, the white teddy bear that the captured lady had kept seeing in her mind, was this little girl's stuffed animal.  The now unmasked man reveals to her and the crowd that the lady recorded her "daughter," whom was actually reported a missing child, being killed by the lady's boyfriend.  The whole episode was like a play. The whole set-up is a White Bear Justice Park.  They refresh and erase the bystander murderer lady's brain every day and have the audience be the recorders, so that she will feel what it is like to be the little girl being recorded while being hunted to kill.

Thursday, April 13, 2017

The Panopticon

     The Panopticon is a type of prison system developed by Jeremy Bentham. It is a system where there is one tower in the middle, with a bright blinding light shining like a beacon toward the surrounding cells around the perimeter. It is set up so that the prison inmates cannot tell whether or not the prison warden is watching their every move, from how they breathe to even how they sleep. The entire purpose is to train and condition the prison inmates into paranoia. The end result will either have the victims go mad with paranoia and fear that they are always being watched, which is the point of the system, or, the alternative, have them know their place. That they cannot so much as think without supervision is a good way to train and discipline the victims to obey under the iron grip of authority. It is a brilliant way to have absolute control over the victims. A downside to this system is if the victims fall prey to madness and insanity; unfortunately, that is not something anyone can control. It is a wildcard in the game of absolute power and dominance. Personally, I would not use this system, despite its effectiveness. I would not because, while it is convenient to control someone by manipulating their psyche, there is something morally wrong about it. In order to use it, the ones in power have to be desensitized to social and moral qualms to use the system to its full potential. Although I personally am fond of control, I prefer unpredictability in a person. Driving the prisoners into paranoid hysterics is not only wrong, but I would want them not to break under the pressure. I would want them to fight back, bring despair upon their captors, and drive them into madness; as long as it's unpredictable, that's what makes humans interesting and fun.
 

Sunday, April 9, 2017

The Panopticon

The Panopticon is an institutional building imagined by Jeremy Bentham. The purpose of the Panopticon is for subjects to be observed without being able to see their observer. This, in turn, makes all of the subjects behave as if they're being watched, even though they can never be sure whether or not someone is watching them. While most people see the idea of the Panopticon as the perfect prison system, Bentham originally imagined the Panopticon as an institution that could be used for multiple purposes. Even though no one has ever officially built the Panopticon, the idea of it is something that we use every day. In various aspects of our lives, authority figures instill a fear of doing the wrong thing by using a few people as examples of what will happen when you do the wrong thing. This makes sure that even when people are not being watched, they still behave as if they are, because they can never be sure whether or not someone is waiting to reprimand them.

Friday, April 7, 2017

Emotional Behavior

When you got hurt as a child, did you cry? Is crying a learned or natural emotional behavior? It's definitely natural; even as infants we cry or display some type of emotional behavior. So then what is the difference between learned and natural emotional behavior? To answer this question, let us refer to the episode of Black Mirror, particularly the scene when Martha told Ash 2.0 to jump.

Was the emotion learned or natural? I say natural, in the sense that Ash 2.0 was programmed to give emotional reactions. Similarly, we are programmed to have emotional responses. The main difference is that we're programmed by a brain and the firing of hormones, while robots are programmed through motherboards and the firing of electrodes. Thus, emotional behavior is normal for both humans and AI. An issue arises when either a human or a robot does not display the right emotion at the right time, yet failure to act appropriately does not negate the naturalness of emotion; it simply means that the human or bot has not been taught when to display emotion.

Tuesday, April 4, 2017

"Be Right Back", one of the surprisingly cheerier and happier ending episodes of Black Mirror, brings to light the concept of robot rights. For us in 2017, this concept might seem a little irrelevant as it seems ridiculous to us that a machine would have emotions and the capacity to suffer; however, only a few decades ago, animals were widely thought to be incapable of suffering as well. In fact, as a form of comparison, it is easier for us to imagine an entirely sentient machine within our lifetime as quantum computers emerge into the mainstream and we carry early prototypes of artificial intelligence in our moderately low power phones (Siri/Google assist), than it was for the humans of 1901 who's earth had never even seen the existence of an airplane and thought it unimaginable for a person to stay in the air against gravity, only to see man land on the moon in their own lifetime less than 70 years later. Our mainstream computers will no longer be binary, a 0 or a 1 to transfer streams of information, but will run on complex and moving characters with varying values. An analogy for this is that mainstream binary computers for which we know emotion is impossible are like a note
in an envelope, with each letter making up words which makes sense together and transfer a message; quantum computers is like a note in an email in which every letter has a link to an entire note. Being able to fit the entire note in the envelope into a single letter of the email is an example of how much more powerful the new computers could be. In short, we are robots ourselves with quantum computing and poor casing that deteriorates dramatically over time. Our emotions and thoughts and innovations are all a result of chemicals with information and electrical impulses interacting with the brain and its transmitters. Whether robot rights should be a concern as we still struggle with basic rights for some humans is another question with various moral angles and topics of self preservation, but whether it will be possible and whether legislation will actually be debated is answered by when.
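
To put a rough number on that analogy (my own illustration, not something from the episode or from class): a classical register of n bits is described by n zeros and ones, while describing a quantum register of n qubits takes 2^n complex amplitudes held all at once, which is where the extra room comes from.

    # Classical: an n-bit register is n values, each 0 or 1.
    classical_register = [0, 1, 1, 0]                  # 4 bits -> 4 numbers

    # Quantum: describing an n-qubit register takes 2**n complex amplitudes.
    def amplitudes_needed(n_qubits: int) -> int:
        return 2 ** n_qubits

    for n in (4, 20, 50):
        print(f"{n} qubits -> {amplitudes_needed(n):,} complex amplitudes")

    # A single qubit in an equal superposition of 0 and 1:
    amp = 2 ** -0.5
    qubit = {"0": amp + 0j, "1": amp + 0j}             # |amplitude|**2 = probability of each outcome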

What does it mean to be a person?


            What does it mean to be a person? Is it how one looks? Is it based on how one thinks? Is there some sort of soul or spirit within us that makes us a person? Or is it the emotions that guide us? Or the intellect that humans are so proud of? We can knock off looks, because having severe deformities or injuries does not take away someone’s personhood, and we have to allow for the fact that humans might not be the only beings able to claim personhood (aliens might exist). It would also be incorrect to base it solely off intellect, since idiots exist, and we now have artificial intelligence. What about the way people think, powered by the brain? Still, the brain is just a cluster of nerves that fire off signals; it can be mimicked by machines and is often the basis used for their design. That rules out the brain. What about emotions, though? Could we really base something as important as personhood on an indiscernible quality? Something that cannot be measured from the outside? How can one know whether or not an emotion is real? Certainly not by facial expressions, especially since they can be faked and mimicked by a good actor or someone programmed to do so, but just because something cannot be easily measured and can be outwardly falsified does not mean that it can be discarded as a means of discernment between what is really a person and not a person. What it really boils down to is that truth cannot be discarded because someone can fake it. True emotions are what separate people from machines. In the Black Mirror episode, the robot Ashebot could only mimic human emotions. He did not suddenly learn emotion based on how he was supposed to feel in a situation. He only followed orders. There were no true emotions. A person does not learn emotions; they merely learn why something is sad. Take, for example, a child at a funeral: the child, a boy for convenience, does not understand why everyone is sad, not because he has yet to learn real emotion, but because he does not yet understand what it means to be dead. After death is explained, the child is not sad because he has been taught how to feel sad, but because he understands that he will never see his grandmother again. The emotions are already there, unlike for Ashebot, who needed someone to explain how to express fear. There was no true fear; he was simply programmed to behave like the original Ashe and to fill any gaps in his knowledge. Therefore, in the case of machines versus people, emotions can be used to determine whether or not the machine is a person, because even sociopaths have been known to know the more intense emotions, such as anger. There still remains the case that emotion cannot separate humans from animals. There is also an argument for the case that people contain eternal souls, which rules out machines and animals, but that argument, whether true or false, will not hold up with the atheists who play a part in legislating what makes a person a person, so it will be ignored for now. There are no qualities here that can rule out both animals and machines from being a person, but a very simple answer could be that people are often irrational, whether this irrationality is our enjoyment of the arts or our willingness to spend all our rent money on a chance to meet our favorite celebrities. There is no basis in survival for this, no instinctual desire for this, yet people will make decisions without thinking or processing the information.
We’ll change our plans just to stop and sit down because it feels nice, we’ll go without food for days to honor our gods even though this goes against our own survival instincts, and you can’t program the willingness to forego everything just to sit in the sun and use the excuse that it felt nice outside, so I forgot to go get food. There may be an algorithm for designing clothes, but there’s no algorithm for desiring comfort or wanting to feel pain if you’re into that kind of stuff.

Black Mirror

In the episode of Black Mirror entitled "Be Right Back," Martha and Ash were a couple, and Martha loses Ash due to an accident. She becomes distraught and has trouble dealing with the fact that he is no longer in her life. Due to the technological advances the country has seen, she ends up with an app that emulates everything Ash would say based on information gained from his social media. She became very attached to this Ash 2.0, and when she broke her phone she became very upset because she thought she would never be able to talk to Ash 2.0 again. Due to this, she stayed attached to this Ash and was unable to move on. Then Ash 2.0 informed her about an option that gave her the opportunity to have a robotic model of Ash that was identical to him physically but only matched his personality based on social media posts. The problem with this is that it would not be like him, because it couldn't copy the emotions behind what he wouldn't post on social media. She slowly grows to dislike this robot version of Ash and asks him to kill himself, but she was unable to actually make him go through with it. She then puts him in the basement to live out the rest of his days, because she can't stand the thought of him being gone forever. I don't agree with this completely, because you will never have the ability to grow as a human being. When things like this happen, it gives the person an opportunity to change and grow emotionally and ethically. When I was run over by a semi, I was able to learn and grow from that experience, and I changed emotionally, psychologically, and physically. I had the opportunity to grow and learn about myself, and because of this I am a better me. If you never have to experience life-changing moments, you will never have the opportunity to grow, and that is why I do not like this reality of never having to experience life-changing moments.

Black Mirror

The episode we saw was called "Be Right Back," and it was pretty messed up. It's basically about how a couple move in together, the guy dies, and the girl is left alone and pregnant. Then one of her friends tells her about this app that looks at a person's posts and acts like that person. The girl was using it as comfort for the loss of her boyfriend, but the problem is that she got so attached to the app that she bought a bionic robot, and she got attached to that too. The problem was that she wanted to replace her boyfriend Ash, so she was expecting the robot to do the same things and have the same memories that Ash had. At the end, it was ironic that Ash 2.0 was placed in the basement with the pictures of his brother. The girl did the same thing that Ash's mother did when Ash's brother died: she wanted to forget about him, so she put all the pictures of him in the basement. Ash's girlfriend did the same with Ash 2.0.

"Be Right Back"

In "Be Right Back", a widowed woman decides to abuse technology and move from talking to a computer generated version of her husband to purchasing a cyborg version of her husband. Martha, the widowed woman, made this decision because she was trying to control her grief. However, she may have only intensified it instead of making it better. Towards the end of the episode, Martha comes to the conclusion that this cyborg version of her husband is not the same as her real husband, and she commands the cyborg to jump off of a cliff. At this moment, Martha realizes that the cyborg is not really human. Humans have distinct characteristics and abilities. Her husband, Asher, was able to be himself by himself. He didn't take in information from social media to become the person he was, but the cyborg wasn't able to just be Asher. It had to think about exactly what the real Asher would have done, and even then, it wasn't always sure. Martha knew that the cyborg wasn't really a person. It was reflective of parts of a person. She really wanted Asher back, and all the cyborg did was remind her of pieces of him. She made changes to the house in order to not be reminded of the pieces of him, so having the cyborg around was like taking two steps backwards away from the strides she was really trying to make. The fact that the cyborg was not completely Asher made Martha miss her husband even more, and her initial attempt failed.

Monday, April 3, 2017

I had never heard of the show "Black Mirror" until Professor Curtis mentioned it in class one day.  Now that I have watched an episode of it, particularly the episode "Be Right Back," I can say that it is a very intriguing and somewhat disturbing series.  In this episode, a young British/Scottish adult couple moves into a new house in the country.  At the start of the episode, the couple is driving to the new house while listening to their favorite tunes on the radio.  The husband talks about his favorite song and starts to joke around and sing it.  Later on, once the couple is settled in, the husband leaves for work one morning and the wife is left at home by herself.  The husband ends up never coming home, and the wife starts to worry that something terrible has happened.  The police show up at her door and she immediately knows her husband is gone forever.  Her friend talks to her at his funeral and suggests that she talk to this piece of technology that can take the role of her husband, but she yells at her and refuses.  She starts to get morning sickness as she deals with the loss of her companion.  Little does she know, until she curiously takes a pregnancy test, that she is pregnant.  She breaks down because she now has no one to share this moment with.  Her friend went ahead and signed her up for the communication tool anyway, and the wife hesitantly starts to use it.  The more she talks to the device, the more she starts to feel like her husband is actually there.  The device suggests proceeding to the next step of communicating: that of a robot.  She gives in and helps build the robot that resembles and plays the role of her husband.  When she eventually has her daughter, the daughter is never told that her father is actually dead, and she begins to develop a relationship with the false object.  This, to me, is psychologically disturbing.  All in all, this episode of "Black Mirror" was something I never expected and hope to never encounter.

Sunday, April 2, 2017

In "Be Right Back" Martha and Ash have been dating and are living in Ash's childhood home. Ash seems to be completely absorbed in technology during the time we know him. Ash has to take the van back to a dealership (maybe wasn't specified) to get another car. He asked Martha if she would like to go with him but she refused since she had some work that needed to be done. Once Ash never came home Martha decided to call the company to see if the van had been returned. When they said no, that's when the cops came to give her the news.  Fast forward to the funeral and a not very specified character (Sara) comes in to play and tells Martha about a program that takes all posts from the deceased's social media and mimics that person to help loved ones "grieve." Martha hates the idea but Sara signs her up anyway and it starts to take over Martha's life. Martha became so obsessed with this program that she would talk to "Ash 2.0" all day everyday. Then without care of money she bought a robot dummy thing that takes the form of Ash. This robot did not seem to be like Ash at all in my opinion. It's like we said in class there is a public life we have on social media and then there is our private life. But my main question I've been thinking about is what if she went with Ash? Would she still have lived? Would HE still have lived? Would they have both been seriously injured or would the wreck have happened at all? It all reminds me of one of my favorite quotes from a movie Mr. Nobody "You have to make the right choice. As long as you don't choose, everything remains possible." This episode was extremely strange to me but I found it very interesting.