Could an AI feel emotions?

Asked Nov 6, 2016 · Viewed 3k times
15

Assuming humans had finally developed the first humanoid AI based on the human brain, would it feel emotions? If not, would it still have ethics and/or morals?

Tags: philosophy, human-like, ethics, emotional-intelligence, emotion-recognition

asked Nov 6, 2016 by MountainSide Studios (edited May 8, 2021 by Shayan Shafiq)

Comments:

– It depends on what you mean by "feeling emotions". I asked a similar question, "Is the simulation of emotional states equivalent to actually experiencing emotions?", on Philosophy SE, and the answer does appear to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). – Left SE On 10_6_19, Nov 6, 2016

– On the last part of your question: if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice it to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter), but the result is the same. – SQLServerSteve, Nov 9, 2016

– Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… – kvfi, Nov 26, 2016

– Have a look at the closely related questions "Can an artificial intelligence suffer?" and "Can an AI learn to suffer?". – nbro, Mar 31, 2019

11 Answers (sorted by highest score)

Answer, score 5:

I have considered many of the responses here, and I would suggest that most people have missed the point when answering the question about emotions. The problem is that scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all the different-shaped slots. What is also ignored is that animals are just as capable of emotions and emotional states as we are: watch YouTube videos of insects fighting, competing, or courting, and it should be clear that simple creatures experience them too.

When I challenge people about emotions, I suggest they go to 1 Corinthians 13, which describes the attributes of love. If you consider all those attributes, you should notice that an actual "feeling" is not required to fulfil any of them. Therefore, the suggestion that a psychopath lacks emotions, and that this is why he commits crimes or other pursuits outside "normal" boundaries, is far from true, especially when one considers the various records left to us from court cases and psychological evaluations, which show that psychopaths do act out of "strong" emotions. A psychopath's behaviour is motivated by negative emotions and emotional states, with a distinct lack of, or disregard for, morality and conscience. Psychopaths "enjoy" what they do. I am strongly suggesting that we are all blinded by our own reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before:

Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation.
A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking.

However, the reasoning below that quote is also seriously flawed. Emotions are both active and passive: they are triggered by thoughts and they trigger our thoughts; emotions are a mental state and a behavioural quality; emotions react to stimuli or measure our responses to them; emotions are independent regulators and moderators, yet they direct our focus and attention to specific criteria; and they help us when intuition and emotion agree, or hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but implementing emotions is far more sophisticated than the one-size-fits-all answer people are seeking here. Anyone who argues that emotions are simply "states" around which a response or responses can be designed does not understand the complexity of emotions, the "freedom" emotions and thoughts have independently of each other, or what constitutes true thought. Programmers and scientists are notorious for "simulating" the real experience of emotions or intelligence without understanding its intimate complexities, thinking that in finding the perfect simulation they have "discovered" the real experience. Psi-theory seems to give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than most realize.

answered Nov 23, 2016 by Engage (edited May 9, 2021)

Answer, score 9:

There is much discussion in philosophy about inner language and the ability to perceive pain (see the "Pain (philosophy)" article).
Your question is in the area of philosophy, not science. If you define emotion as some state, then you can construct a simple automaton with two states (emotion vs. no-emotion). It can also be a very complicated state with degrees of truth (a percentage of emotion). Basically, to mimic human emotion you would need to make a living, human-like organism, and even then, with today's understanding and technology, you would not be able to recognize emotion in it. The only thing you could do is trust it when it says "I'm sad". Now we are in the area of the Turing test, which again is philosophy, not science.

answered Nov 8, 2016 by aristotelo_ver2 (edited Dec 9, 2016 by Christian Westbrook)

Answer, score 6:

It is theoretically possible for an AI to feel emotion. According to Murray Shanahan's book The Technological Singularity, there are two primary forms of AI:

1) Human-based AI, achieved through processes such as whole brain emulation. The functioning of human-based AI would likely be indistinguishable from that of the human brain, and, as a consequence, human-based AI would likely experience emotion in the same manner as humans.

2) AI from scratch. With this form of AI, based on machine learning algorithms and complex goal-driven processes, we enter uncharted territory: its development is inherently unpredictable and unlike anything in the biological sample space of intelligence we have access to. With this form of AI, there is no telling if or how it could experience emotion.

As the question references the former, it is very likely that human-based AI would indeed experience emotion and other human-like characteristics.
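The two-state automaton with a degree-of-truth refinement described in an earlier answer can be sketched as a toy model. Everything here (the class, the update rule, the 0.5 threshold) is an illustrative assumption, not anything defined in that answer:

```python
# Toy "emotion automaton": a binary state driven by a fuzzy degree in [0, 1].
# The update rule and threshold are arbitrary illustrative choices.

class EmotionAutomaton:
    def __init__(self, threshold=0.5):
        self.degree = 0.0          # fuzzy "percentage of emotion"
        self.threshold = threshold

    def stimulus(self, intensity):
        """Nudge the degree by a stimulus, clamped to [0, 1]."""
        self.degree = min(1.0, max(0.0, self.degree + intensity))

    @property
    def state(self):
        # Collapse the fuzzy degree back into the two-state automaton.
        return "emotion" if self.degree >= self.threshold else "no-emotion"

a = EmotionAutomaton()
a.stimulus(0.25)
print(a.state, a.degree)   # no-emotion 0.25
a.stimulus(0.5)
print(a.state, a.degree)   # emotion 0.75
```

As the answer notes, nothing in such a model tells you whether the machine "feels" anything; it only defines emotion as a state.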
answered Nov 20, 2016 by GJZ

Answer, score 5:

Assuming an AI was built out of a mechanical husk mirroring the human brain exactly, complete with chemical signals and all, it should theoretically be capable of feeling/processing emotions.

answered Nov 6, 2016 by Siri

Comments:

– Are you Apple's Siri? – skrtbhtngr, Nov 13, 2016

– Dualists would argue that even then, an AI wouldn't be able to feel emotions; see the philosophical zombie argument and the problem of qualia. – Alex S King, Nov 26, 2016

– You should differentiate between the ability to process emotions and mimicry. Emotions are a much bigger deal than just chemical signals; it's also a matter of perception. – kvfi, Nov 26, 2016

– Although I agree with @kvfi that it is also a matter of perception (which depends on a lot of factors), I believe that, with the current pace of advancements in the field of AI, an AI should be able to feel emotions by taking into account and processing every piece of data related to it, as @Siri said. – SE_User, Nov 28, 2016

Answer, score 3:

Emotions are a factor in humans having ethics/morals only because they are a factor in all human learning and decision-making. Unless you are duplicating a human being exactly, there is no reason to think that an AI will learn the way a human learns, or make decisions in the same way a human makes decisions.
Therefore, whether it "feels emotion" just as we do, or whether it simply responds to outcomes ("cost is greater = don't go there"), the outcome of ethical behaviour could be achieved. An AI could behave perfectly ethically without any need to feel empathy, shame, etc. You could also argue that a lot of unethical behaviour in human beings is driven by emotions, and that an unemotional but ethical AI may well do a better overall job than a human being.

answered Nov 20, 2016 by Jnani Jenny Hale

Comment:

– An unemotional AI human-duplicate doesn't have the same potential as a human being. Emotions may be more challenging to handle, but they are an additional resource. – user72620, Aug 1, 2025

Answer, score 2:

This question is more the province of philosophy of mind than of AI; here are some detailed answers to your question from Philosophy SE: "Is simulating emotions the same as experiencing emotions?" and "What is the problem with physicalism?". For the record, the accepted answer (by Siri) to this question is not entirely correct (the position in that answer corresponds roughly to John Searle's view on the question, and his is a minority view): dualists would argue that even with a perfect replication down to the chemical level of brain interactions, an AI still wouldn't experience emotions, as it lacks the purely mental substance/properties that make a mind and not a machine. On the completely opposite side of the spectrum, functionalists would answer that such a perfect replication is overkill: even a suitably programmed digital computer can experience emotion, particularly if one equips it with higher-order and self-referential states.
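The "cost is greater = don't go there" decision rule mentioned in an earlier answer can be made concrete with a small sketch. The actions, benefit/harm numbers, and the veto rule are all invented for illustration:

```python
# An agent that behaves "ethically" by pure cost comparison, with no
# emotional machinery: it never picks an action whose harm to others
# exceeds its own benefit. All actions and numbers are hypothetical.

ACTIONS = {
    # action: (benefit_to_agent, harm_to_others)
    "tell_truth":   (1.0, 0.0),
    "lie_for_gain": (3.0, 5.0),
    "help":         (0.5, -1.0),   # negative harm = benefit to others
}

def choose(actions):
    # Forbid any action where harm outweighs benefit, then maximize benefit.
    permitted = {a: (b, h) for a, (b, h) in actions.items() if h <= b}
    return max(permitted, key=lambda a: permitted[a][0])

print(choose(ACTIONS))  # tell_truth
```

The point of the answer survives the toy model: the output is ethical behaviour, with no claim that anything is felt along the way.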
answered Nov 26, 2016 by Alex S King (edited Apr 13, 2017)

Comment:

– It is definitely a philosophical question in regard to the simulation of emotions, but removing anthropomorphism and focusing on fundamental philosophical principles, it can be shown to be mathematical: ai.stackexchange.com/a/2877/1671. To be truly felt, the emotion would have to arise out of something machines consider important, such as optimization, computing resources, calculability/intractability, and equilibria. However, it surely becomes philosophical again in terms of the consciousness, or lack thereof, of the system. (Excellent answer, btw!) – DukeZhou, Mar 1, 2017

Answer, score 1:

Well, it depends on the level of the AI. You can create a highly autonomous AI with deep learning capabilities and so on, but of the robotic type only. If you could create an AI like Ava in the movie Ex Machina (humanoid form, deep neural transmissions, cognitive dissonance), then it could feel. The AI problem is not the chemical and neural transmissions; it's consciousness.

answered Nov 6, 2016 by Ivan Cantarino

Answer, score 1:

Yes and no. If you fully simulate a human brain and all of its functions, it would probably be able to feel emotions very similarly to the way we do. But we don't have the capability or knowledge to do that, and maybe we could find a "shortcut": a process that is intelligent without simulating a whole brain. In that case, emotions would probably be represented by data values which say "this is good (make it happen again!)" or "this is bad (avoid it!)".
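That good/bad data value is essentially a reward signal. A minimal sketch, with the situation names, learning rate, and rewards all invented for illustration:

```python
# Minimal "valence as data" sketch: the agent keeps a good/bad score per
# situation and updates it from a reward signal, reinforcement-learning
# style. The learning rate and reward values are arbitrary.

def update(valence, reward, lr=0.5):
    """Move the stored valence part of the way toward the observed reward."""
    return valence + lr * (reward - valence)

valences = {"fire": 0.0, "food": 0.0}
for _ in range(5):
    valences["fire"] = update(valences["fire"], -1.0)  # "this is bad"
    valences["food"] = update(valences["food"], +1.0)  # "this is good"

avoid = [s for s, v in valences.items() if v < 0]
print(avoid)  # ['fire']
```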
This is just a very basic example (there are obviously many more emotions), but it would serve a similar function, and the AI would reach solutions similar to ours. But we don't know, and probably no one ever will, whether this 'bad' data value "feels" to the AI the way the corresponding emotion feels to us.

answered Nov 9, 2016 by Namnodorel

Answer, score 1:

You first need to express emotions (you can do that without the aid of AI), and then you need someone to perceive that expression and empathize with it. If no one is there to see it, or if the observer is a psychopath, they would probably say it doesn't have emotions; in that sense, it is irrelevant/subjective. If you can empathize with characters in movies who "act" emotions, then you get my point.

answered Nov 27, 2016 by user3874

Answer, score 1:

IMHO, definitely yes! Everything a person feels (physically or mentally) can be traced to chemical signal processing in their body or brain. If we understand the policy and nature of such signals, we can program it. There are many psychology (and pseudo-psychology) works in this sphere; if you're interested, I can suggest:

– Cognitive Psychology (Robert L. Solso): describes the cognitive apparatus of the human mind in simple words;

– The Psychology of Emotions (Carroll E. Izard): thoroughly describes each kind of emotion by its appearance on the human face (both child and adult), its low-level cognitive mechanism, and related or adjacent emotions;

– books by Paul Ekman ("Telling Lies", "Emotions Revealed", "Unmasking the Face"): practical detection of human emotions from the language of microexpressions on the face and body.

Update 2023-01-05, to give more details: a person forms their own emotions through their own experience, social norms, and other factors. Every emotion has its own strength and can vary from person to person.
People respond in different ways to similar problems. You can learn to control your emotions and not allow some of them to progress (like anger, as society teaches us to do). It is worth mentioning that there are many psychological problems people can have, such as learned helplessness, which shows that our experience can make us feel strange emotions in reaction to certain triggers. This is an example that emotions, like everything else, can be learned. More than that: you can control the emotions your AI will have. Of course, there remains the question of whether an AI needs emotions and what benefits it can gain from them.

answered Nov 27, 2016 by barbariania (edited Feb 4, 2023)

Answer, score 0:

Could an AI feel emotions? Anthropic published an interesting blog post on this yesterday (2026-04-02). The TL;DR is that the LLM they studied does not feel emotions, but it can develop internal functional analogues of human emotions that influence its behavior. Anthropic's research shows that large language models contain internal patterns, sometimes called "emotion vectors", that correspond to concepts like happiness, fear, or desperation. These are not subjective experiences; they are computational states that systematically affect outputs. This aligns with a functional view of emotion: emotions are not defined by "what they feel like" but by what they do, mapping inputs to adaptive responses. For more details, see Anthropic's blog post and paper "Emotion Concepts and their Function in a Large Language Model" (2026-04-02), and the YouTube video.

answered 1 hour ago by Franck Dernoncourt
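A common way such concept directions are approximated in interpretability work is as a difference of mean internal activations between inputs that do and do not express the concept. The sketch below is a generic toy illustration with synthetic three-dimensional "activations", not Anthropic's actual method or data:

```python
# Toy "emotion vector": the difference between mean activations on "happy"
# vs. neutral inputs, then used to score a new activation by dot product.
# The activation vectors here are tiny synthetic stand-ins.

def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

happy_acts   = [[0.9, 0.1, 0.4], [0.8, 0.2, 0.5]]   # synthetic
neutral_acts = [[0.1, 0.1, 0.4], [0.2, 0.2, 0.5]]   # synthetic

emotion_vector = [h - n for h, n in zip(mean(happy_acts), mean(neutral_acts))]

# Project new activations onto the vector: a larger score = more "happy"-like.
print(dot([0.85, 0.15, 0.45], emotion_vector) > dot([0.15, 0.15, 0.45], emotion_vector))  # True
```

On this functional reading, the vector is a computational state that shifts outputs; whether anything is experienced is exactly the question the thread leaves open.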
Ask question Explore related questions philosophy human-like ethics emotional-intelligence emotion-recognition See similar questions with these tags.
15 $\begingroup$ Assuming humans had finally developed the first humanoid AI based on the human brain, would It feel emotions? If not, would it still have ethics and/or morals? philosophy human-like ethics emotional-intelligence emotion-recognition Share Improve this question Follow edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges $\endgroup$ 4 4 $\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30 $\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10 $\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17 $\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . 
$\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56 Add a comment |
15 $\begingroup$ Assuming humans had finally developed the first humanoid AI based on the human brain, would It feel emotions? If not, would it still have ethics and/or morals? philosophy human-like ethics emotional-intelligence emotion-recognition Share Improve this question Follow edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges $\endgroup$ 4 4 $\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30 $\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10 $\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17 $\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . 
$\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56 Add a comment |
$\begingroup$ Assuming humans had finally developed the first humanoid AI based on the human brain, would It feel emotions? If not, would it still have ethics and/or morals? philosophy human-like ethics emotional-intelligence emotion-recognition Share Improve this question Follow edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges $\endgroup$
Assuming humans had finally developed the first humanoid AI based on the human brain, would It feel emotions? If not, would it still have ethics and/or morals?
Assuming humans had finally developed the first humanoid AI based on the human brain, would It feel emotions? If not, would it still have ethics and/or morals?
philosophy human-like ethics emotional-intelligence emotion-recognition
philosophy human-like ethics emotional-intelligence emotion-recognition
philosophy human-like ethics emotional-intelligence emotion-recognition
Share Improve this question Follow edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges
Share Improve this question Follow edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges
Share Improve this question Follow
Share Improve this question Follow
Share Improve this question Follow
Improve this question
edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
edited May 8, 2021 at 17:04 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
edited May 8, 2021 at 17:04
edited May 8, 2021 at 17:04
Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges
asked Nov 6, 2016 at 1:51 MountainSide Studios 403 4 4 silver badges 9 9 bronze badges
asked Nov 6, 2016 at 1:51
asked Nov 6, 2016 at 1:51
MountainSide Studios 403 4 4 silver badges 9 9 bronze badges
403 4 4 silver badges 9 9 bronze badges
4 $\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30 $\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10 $\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17 $\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . $\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56 Add a comment |
4 $\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30 $\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10 $\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17 $\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . $\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56
$\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30
$\begingroup$ It depends on what you mean by "feeling emotions". I asked a similar question Is the simulation of emotional states equivalent to actually experiencing emotions? on Philosophy SE, and the answer does appears to hinge on the definition of "feeling" (especially when dealing with the concept of "qualia"). $\endgroup$ Left SE On 10_6_19 – Left SE On 10_6_19 2016-11-06 03:30:32 +00:00 Commented Nov 6, 2016 at 3:30
Left SE On 10_6_19 – Left SE On 10_6_19
2016-11-06 03:30:32 +00:00
$\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10
$\begingroup$ On the last part of your question, if it couldn't feel emotion, then it probably couldn't relate to other beings that have them. An inability (or unwillingness?) to consider other people's feelings is the #1 telltale sign of a sociopath. Whether or not a sociopath lacks morals or just has a corrupted moral code is a deep question unto itself, but suffice to say that either way, they end up dealing with others in a selfish manner. In the AI's case it would probably be the former problem, lacking a moral sense (in the case of humans it's probably the latter) but the result's the same $\endgroup$ SQLServerSteve – SQLServerSteve 2016-11-09 04:10:26 +00:00 Commented Nov 9, 2016 at 4:10
SQLServerSteve – SQLServerSteve
2016-11-09 04:10:26 +00:00
$\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17
$\begingroup$ Machines can't take decisions based on human judgement. ai.stackexchange.com/questions/1354/… $\endgroup$ kvfi – kvfi 2016-11-26 12:17:47 +00:00 Commented Nov 26, 2016 at 12:17
2016-11-26 12:17:47 +00:00
$\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . $\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56
$\begingroup$ Have a look at the very related questions: Can an artificial intelligence suffer? and Can an AI learn to suffer? . $\endgroup$ nbro – nbro 2019-03-31 12:56:18 +00:00 Commented Mar 31, 2019 at 12:56
2019-03-31 12:56:18 +00:00
11 Answers 11 Sorted by: Reset to default Highest score (default) Date modified (newest first) Date created (oldest first) 5 $\begingroup$ I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions. The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots. Also, what is ignored is that animals are just as capable of emotions and emotional states as we are: When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too! When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them. Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions. It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do. I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before: - Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. 
A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking. However, his reasoning below it (that quote) is also seriously flawed. Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here. Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought! Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience. The Psi-theory seems to adequately give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize. Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge $\endgroup$ Add a comment | 9 $\begingroup$ There is much discussion in philosophy about inner language and the ability to perceive pain (see Pain in philosophy article). 
Your question is in the area of philosophy, not science. If you define emotion as some state, then you can construct a simple automaton with two states (emotion vs. no emotion). It can also be a very complicated state with degrees of truth (a percentage of emotion). Basically, to mimic human emotion you would need to make a living, human-like organism, and even then, with today's understanding and technology, you would not be able to recognize emotion in it. The only thing you can do is trust it when it says "I'm sad". Now we are in the area of the Turing test, which is again philosophy, not science.

answered Nov 8, 2016 by aristotelo_ver2; edited Dec 9, 2016 by Christian Westbrook

6 votes

It is certainly possible, in theory, for AI to feel emotion. According to Murray Shanahan's book The Technological Singularity, there are two primary forms of AI:

1) Human-based AI - achieved through processes such as whole brain emulation. The functioning of human-based AI would likely be indistinguishable from that of the human brain, and, as a consequence, it would likely experience emotion in the same manner as humans.

2) AI from scratch - with this form of AI, based on machine learning algorithms and complex goal-driven processes, we enter uncharted territory: its development is inherently unpredictable and unlike anything we observe in the biological sample of intelligence we have access to. There is no telling if, or how, it could experience emotion.

Since the question references the former, it is very likely that human-based AI would indeed experience emotion and other human-like characteristics.
answered Nov 20, 2016 by GJZ

5 votes

Assuming an AI were built out of a mechanical husk mirroring the human brain exactly, complete with chemical signals and all, it should theoretically be capable of feeling and processing emotions.

answered Nov 6, 2016 by Siri

Comments:
- "Are you Apple's Siri?" - skrtbhtngr, Nov 13, 2016
- "Dualists would argue that even then, an AI wouldn't be able to feel emotions; see the philosophical zombie argument and the problem of qualia." - Alex S King, Nov 26, 2016
- "You should differentiate between the ability to process emotions and mimicry. Emotions are a much bigger deal than just chemical signals; they are also a matter of perception." - kvfi, Nov 26, 2016
- "Although I agree with @kvfi that it is also a matter of perception (which depends on many factors), I believe that with the current pace of advancement in the field of AI, an AI should, as @Siri said, be able to feel emotions by taking into account and processing every piece of data related to them." - SE_User, Nov 28, 2016

3 votes

Emotions are a factor in humans having ethics and morals only because they are a factor in all human learning and decision-making. Unless you are duplicating a human being exactly, there is no reason to think that an AI will learn the way a human learns, or make decisions the way a human makes decisions.
Therefore, whether it "feels emotion" just as we do, or whether it simply responds to outcomes ("the cost is greater, so don't go there"), the outcome of ethical BEHAVIOUR could be achieved. An AI could behave perfectly ethically without any need to feel empathy, shame, and so on. You could also argue that a lot of UNETHICAL behaviour in human beings is driven by emotions, and that an unemotional but ethical AI might well do a better overall job than a human being.

answered Nov 20, 2016 by Jnani Jenny Hale

Comment: "An unemotional AI human-duplicate doesn't have the same potential as a human being. Emotions may be more challenging to handle, but they are an additional resource." - user72620, Aug 1, 2025

2 votes

This question is more the province of philosophy of mind than of AI; here are some detailed answers to it from the Philosophy SE: "Is simulating emotions the same as experiencing emotions?" and "What is the problem with physicalism?". For the record, the accepted answer here (by Siri) is not entirely correct; the position in that answer corresponds roughly to John Searle's view, which is a minority one. Dualists would argue that even with a perfect replication down to the chemical level of brain interactions, an AI still wouldn't experience emotions, as it lacks the purely mental substance or properties that make a mind rather than a machine. At the completely opposite end of the spectrum, functionalists would answer that such a perfect replication is overkill: even a suitably programmed digital computer can experience emotion, particularly if one equips it with higher-order and self-referential states.
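As an editorial aside: the "higher-order and self-referential states" the functionalist position appeals to can be sketched in a few lines of Python. This is a toy illustration under invented names (nothing here comes from the answer itself), and it makes no claim that such a program experiences anything; it only shows the structure of a first-order state plus a second state that represents it:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Toy functionalist sketch: a first-order 'affect' value plus a
    higher-order operation that represents and reports on it."""
    affect: float = 0.0                       # first-order state, in [-1, 1]
    history: list = field(default_factory=list)

    def perceive(self, reward: float) -> None:
        # First-order update: stimuli shift the global affect state,
        # clamped to [-1, 1].
        self.affect = max(-1.0, min(1.0, self.affect + reward))
        self.history.append(self.affect)

    def introspect(self) -> str:
        # Higher-order, self-referential state: the agent takes its own
        # affect as an object and reports on it.
        if self.affect > 0.3:
            return "I notice I am in a positive state"
        if self.affect < -0.3:
            return "I notice I am in a negative state"
        return "I notice I am in a neutral state"

agent = Agent()
agent.perceive(-0.5)
print(agent.introspect())  # "I notice I am in a negative state"
```

Whether having such a self-model amounts to feeling anything is exactly the point the dualists and functionalists above disagree on; the code only makes the functionalist's structural claim concrete.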
answered Nov 26, 2016 by Alex S King

Comment: "It is definitely a philosophical question with regard to the simulation of emotions, but removing anthropomorphism and focusing on fundamental philosophical principles, it can be shown to be mathematical: ai.stackexchange.com/a/2877/1671. To be truly felt, the emotion would have to arise out of something machines consider important, such as optimization, computing resources, calculability/intractability, and equilibria. However, it surely becomes philosophical again in terms of the consciousness, or lack thereof, of the system. [Excellent answer, btw!]" - DukeZhou, Mar 1, 2017

1 vote

Well, it depends on the level of the AI. You can create a highly autonomous AI with deep learning capabilities and so on, but only of the robotic type. If you created an AI like Ava in the movie Ex Machina, with a humanoid form, deep neural transmissions, and cognitive dissonance, then it could feel. The problem for AI is not the chemical and neural transmissions; it is consciousness.

answered Nov 6, 2016 by Ivan Cantarino

1 vote

Yes and no. If you fully simulated a human brain and all of its functions, it would probably be able to feel emotions very much the way we do. But we don't have the capability or knowledge to do that, and perhaps we could find a "shortcut": a process that is intelligent without simulating a whole brain. In that case, emotions would probably be represented by data values that say "this is good (make it happen again!)" or "this is bad (avoid it!)".
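That "good/bad data value" idea is essentially a reward signal, and can be sketched as a tiny value-learning loop. The actions, rewards, and constants below are illustrative assumptions, not anything specified in the answer:

```python
import random

# Toy sketch of emotions as "good/bad" data values: the agent tags
# outcomes with a learned value and lets those values bias future
# choices, the way a crude emotion might.
values = {"a": 0.0, "b": 0.0}      # learned good/bad signal per action
outcomes = {"a": 1.0, "b": -1.0}   # environment: "a" is good, "b" is bad

random.seed(0)
for _ in range(100):
    # Mostly pick the action currently "felt" to be best; explore sometimes.
    if random.random() < 0.1:
        action = random.choice(list(values))
    else:
        action = max(values, key=values.get)
    reward = outcomes[action]
    # "this is good (make it happen again!)" / "this is bad (avoid it!)"
    values[action] += 0.1 * (reward - values[action])

print(max(values, key=values.get))  # the agent ends up preferring "a"
```

Nothing in this loop settles whether the negative value attached to "b" is felt; it only shows how such values can play the behaviour-shaping role the answer describes.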
This is just a very basic example (there are obviously many more emotions), but such values would serve a similar function, and the AI would arrive at solutions similar to ours. But we don't know, and probably no one ever will, whether that "bad" data value "feels" to the AI the way the corresponding emotion feels to us.

answered Nov 9, 2016 by Namnodorel

1 vote

You first need to express emotions, which you can do without the aid of AI, and then you need someone to perceive that expression and empathize with it. If no one is there to see it, or if the observer is a psychopath, they would probably say it doesn't have emotions; in that sense the question is subjective. If you can empathize with characters in movies who "act out" emotions, then you get my point.

answered Nov 27, 2016 by user3874

1 vote

IMHO, definitely yes! Everything that a person feels, physically or mentally, can be traced to chemical signal processing in the body or brain. If we understand the policies and nature of such signals, we can program them. There is a lot of work in psychology (and pseudo-psychology) in this area; if you are interested, I can suggest:

- Cognitive Psychology (Robert L. Solso): describes the cognitive apparatus of the human mind in simple terms;
- The Psychology of Emotions (Carroll E. Izard): thoroughly describes each kind of emotion by its appearance on the human face (both child and adult), its low-level cognitive mechanism, and related or adjacent emotions;
- Books by Paul Ekman (Telling Lies, Emotions Revealed, Unmasking the Face): practical detection of human emotions through the language of microexpressions on the face and body.

Update, 2023-01-05, to give more details: a person forms their own emotions through their own experience, social norms, and other factors. Every emotion has its own strength and can vary from person to person.
People behave in different ways when facing similar problems. You can learn to control your emotions and not allow some of them to progress (such as anger, as society teaches us to do). It is also worth mentioning that people can have psychological problems such as learned helplessness, which shows that our experience can make us feel strange emotions as reactions to certain triggers; this is an example that emotions, like everything else, can be learned. More than that, you can control which emotions your AI will have. Of course, there remains the question of whether an AI needs emotions and what benefits it could derive from them.

answered Nov 27, 2016 by barbariania; edited Feb 4, 2023

0 votes

Could an AI feel emotions? Anthropic published an interesting blog post on this yesterday (2026-04-02); the tl;dr is that the LLM they studied does not feel emotions, but it can develop internal functional analogues of human emotions that influence its behavior. Anthropic's research shows that large language models contain internal patterns, sometimes called "emotion vectors", that correspond to concepts like happiness, fear, or desperation. These are not subjective experiences; they are computational states that systematically affect outputs. This aligns with a functional view of emotion: emotions are not defined by what they feel like but by what they do, namely mapping inputs to adaptive responses. For more details, see Anthropic's blog post, the paper "Emotion Concepts and their Function in a Large Language Model" (2026-04-02), and the accompanying YouTube video.

answered 1 hour ago by Franck Dernoncourt
11 Answers 11 Sorted by: Reset to default Highest score (default) Date modified (newest first) Date created (oldest first)
11 Answers 11 Sorted by: Reset to default Highest score (default) Date modified (newest first) Date created (oldest first)
Sorted by: Reset to default Highest score (default) Date modified (newest first) Date created (oldest first)
Sorted by: Reset to default Highest score (default) Date modified (newest first) Date created (oldest first)
Sorted by: Reset to default
Highest score (default) Date modified (newest first) Date created (oldest first)
5 $\begingroup$ I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions. The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots. Also, what is ignored is that animals are just as capable of emotions and emotional states as we are: When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too! When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them. Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions. It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do. I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before: - Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking. However, his reasoning below it (that quote) is also seriously flawed. 
Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here. Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought! Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience. The Psi-theory seems to adequately give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize. Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge $\endgroup$ Add a comment |
5 $\begingroup$ I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions. The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots. Also, what is ignored is that animals are just as capable of emotions and emotional states as we are: When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too! When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them. Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions. It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do. I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before: - Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking. However, his reasoning below it (that quote) is also seriously flawed. 
Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here. Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought! Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience. The Psi-theory seems to adequately give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize. Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge $\endgroup$ Add a comment |
$\begingroup$ I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions. The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots. Also, what is ignored is that animals are just as capable of emotions and emotional states as we are: When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too! When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them. Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions. It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do. I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before: - Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking. However, his reasoning below it (that quote) is also seriously flawed. 
Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here. Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought! Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience. The Psi-theory seems to adequately give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize. Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge $\endgroup$
I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions. The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots. Also, what is ignored is that animals are just as capable of emotions and emotional states as we are: When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too! When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them. Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions. It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do. I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others. Though I do agree with the following quote mentioned before: - Dave H. wrote: From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking. However, his reasoning below it (that quote) is also seriously flawed. 
Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash. A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here. Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought! Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience. The Psi-theory seems to adequately give a proper understanding of the matter. So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize.
I have considered much of the responses here, and I would suggest that most people here have missed the point when answering the question about emotions.
The problem is, scientists keep looking for a single solution as to what emotions are. This is akin to looking for a single shape that will fit all different-shaped slots.
Also, what is ignored is that animals are just as capable of emotions and emotional states as we are:
When looking on Youtube for insects fighting each other, or competing or courting, it should be clear that simple creatures experience them too!
When I challenge people about emotions, I suggest to them to go to Corinthians 13 - which describes the attributes of love. If you consider all those attributes, one should notice that an actual "feeling" is not required for fulfilling any of them.
Therefore, the suggestion that a psychopath lacks emotions, and so he commits crimes or other pursuits outside of "normal" boundaries is far from true, especially when one considers the various records left to us from court cases and perhaps psychological evaluation - which show us that they do act out of "strong" emotions.
It should be considered that a psychopath's behaviour is motivated out of negative emotions and emotional states with a distinct lack of or disregard of morality and a disregard of conscience. Psychopaths "enjoy" what they do.
I am strongly suggesting to all that we are blinded by our reasoning, and by the reasoning of others.
Though I do agree with the following quote mentioned before: -
From a computational standpoint, emotions represent a global state that influences a lot of other processing. Hormones etc. are basically just implementation. A sentient or sapient computer certainly could experience emotions, if it was structured in such a way as to have such global states affecting its thinking.
However, his reasoning below it (that quote) is also seriously flawed.
Emotions are both active and passive: They are triggered by thoughts and they trigger our thoughts; Emotions are a mental state and a behaviourial quality; Emotions react to stimuli or measure our responses to them; Emotions are independant regulators and moderators; Yet they provoke our focus and attention to specific criteria; and they help us when intuition and emotion agree or they hinder us when conscience or will clash.
A computer has the same potential as us to feel emotions, but the skill of implementing emotions is much more sophisticated than the one solution fits all answer people are seeking here.
Also, if anyone argues that emotions are simply "states" where a response or responses can be designed around it, really does not understand the complexity of emotions; the "freedom" emotions and thoughts have independently of each other; or what constitutes true thought!
Programmers and scientists are notorious for "simulating" the real experiences of emotions or intelligence, without understanding the intimate complexities; Thinking that in finding the perfect simulation they have "discovered" the real experience.
The Psi-theory seems to adequately give a proper understanding of the matter.
So I would say that the simulation of emotional states "is" equivalent to experiencing emotions, but those emotional states are far more complex than what most realize.
Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge
Share Improve this answer Follow edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge
Share Improve this answer Follow
Share Improve this answer Follow
Share Improve this answer Follow
edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
edited May 9, 2021 at 11:41 Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
edited May 9, 2021 at 11:41
edited May 9, 2021 at 11:41
Shayan Shafiq 350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
350 1 1 gold badge 4 4 silver badges 12 12 bronze badges
answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge
answered Nov 23, 2016 at 19:39 Engage 66 1 1 bronze badge
answered Nov 23, 2016 at 19:39
answered Nov 23, 2016 at 19:39
Engage 66 1 1 bronze badge
9 $\begingroup$ There is much discussion in philosophy about inner language and the ability to perceive pain (see the "Pain" article in philosophy). Your question belongs to philosophy, not science. If you define emotion as some state, then you can construct a simple automaton with two states (emotion vs. no-emotion). The state can also be very complicated, with degrees of truth (a percentage of emotion).
Basically, to mimic human emotion you would need to make a living, human-like organism, and even then, with today's understanding and technology, you would not be able to recognize emotion in it. The only thing you could do is trust it when it says "I'm sad". Now we are in the territory of the Turing test, which again is philosophy, not science. $\endgroup$
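The two-state idea above can be sketched as a toy automaton. This is only an illustration of the answer's "emotion vs. no-emotion" states plus a degree of truth; the class name, the 0-to-1 scale, and the threshold are assumptions made up for the sketch, not a model of real affect:

```python
# Toy automaton: a binary emotion/no-emotion state derived from a
# continuous "degree of emotion" in [0, 1], as the answer suggests.
# All names and numbers here are illustrative assumptions.

class EmotionAutomaton:
    def __init__(self, threshold=0.5):
        self.level = 0.0          # degree of "emotion", 0.0 .. 1.0
        self.threshold = threshold

    def stimulate(self, amount):
        # Accumulate stimulus, clamped into [0, 1].
        self.level = max(0.0, min(1.0, self.level + amount))

    @property
    def state(self):
        # Binary state derived from the continuous degree of truth.
        return "emotion" if self.level >= self.threshold else "no-emotion"

a = EmotionAutomaton()
a.stimulate(0.25)
print(a.state)   # no-emotion
a.stimulate(0.5)
print(a.state)   # emotion
```

Note that the point of the answer survives the sketch: the automaton has an "emotion" state, yet nothing about it tells us whether anything is felt.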
edited Dec 9, 2016 at 17:52 by Christian Westbrook; answered Nov 8, 2016 at 9:33 by aristotelo_ver2
6 $\begingroup$ It is theoretically possible for an AI to feel emotion. According to Murray Shanahan's book The Technological Singularity, there are two primary forms of AI:
1) Human-based AI - achieved through processes such as whole brain emulation. The functioning of human-based AI would likely be indistinguishable from that of the human brain, and, as a consequence, it would likely experience emotion in the same manner as humans.
2) AI from scratch - based on machine learning algorithms and complex goal-driven processes. With this form of AI we enter uncharted territory: its development is inherently unpredictable and unlike anything we observe in the biological sample space of intelligence we have access to. There is no telling if or how it could experience emotion.
As the question references the former, it is very likely that human-based AI would indeed experience emotion and other human-like characteristics. $\endgroup$
answered Nov 20, 2016 at 12:50 by GJZ
5 $\begingroup$ Assuming an AI were built in a mechanical husk mirroring the human brain exactly, complete with chemical signals and all, it should theoretically be capable of feeling and processing emotions. $\endgroup$
answered Nov 6, 2016 at 4:52 by Siri
2 $\begingroup$ Are you Apple's Siri? $\endgroup$ skrtbhtngr Commented Nov 13, 2016 at 9:55
1 $\begingroup$ Dualists would argue that even then, an AI wouldn't be able to feel emotions; see the philosophical zombie argument and the problem of qualia. $\endgroup$ Alex S King Commented Nov 26, 2016 at 7:53
$\begingroup$ You should distinguish the ability to process emotions from merely mimicking them. Emotions are a much bigger deal than just chemical signals; they are also a matter of perception. $\endgroup$ kvfi Commented Nov 26, 2016 at 12:13
$\begingroup$ Although I agree with @kvfi that it is also a matter of perception (which depends on a lot of factors), I believe that, at the current pace of advancements in the field of AI, an AI should, as @Siri said, be able to feel emotions by taking into account and processing every piece of data related to them. $\endgroup$ SE_User Commented Nov 28, 2016 at 8:28
3 $\begingroup$ Emotions are a factor in humans having ethics/morals only because they are a factor in all human learning and decision-making.
Unless you are duplicating a human being exactly, there is no reason to think that an AI will learn the way a human learns, or make decisions the way a human makes decisions.
Therefore, whether it "feels emotion" just as we do, or whether it simply responds to outcomes ("cost is greater = don't go there"), the outcome of ethical BEHAVIOUR could be achieved. An AI could behave perfectly ethically without any need to feel empathy, shame, etc.
You could also argue that a lot of UNETHICAL behaviour in human beings is driven by emotions too, and that an unemotional but ethical AI may well do a better overall job than a human being. $\endgroup$
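The "cost is greater = don't go there" rule this answer describes can be illustrated as a bare cost-benefit comparison; the action names and numbers below are hypothetical, invented purely for the sketch:

```python
# Minimal sketch of outcome-based "ethics without emotion":
# the agent feels nothing, it simply refuses any action whose
# estimated cost exceeds its estimated benefit, then picks the
# best remaining option. Actions and numbers are made up.

def choose(actions):
    # "Don't go there": discard actions where cost outweighs benefit.
    permitted = [a for a in actions if a["benefit"] > a["cost"]]
    if not permitted:
        return None  # no ethically acceptable action
    # Among permitted actions, maximize net outcome.
    return max(permitted, key=lambda a: a["benefit"] - a["cost"])

actions = [
    {"name": "lie",  "benefit": 2.0, "cost": 5.0},  # cost is greater: rejected
    {"name": "help", "benefit": 4.0, "cost": 1.0},
]
print(choose(actions)["name"])   # help
```

The design point matches the answer: nothing in `choose` models empathy or shame, yet the observable behaviour can still track an ethical rule, provided the cost estimates encode it.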
answered Nov 20, 2016 at 6:00 by Jnani Jenny Hale
1 $\begingroup$ An unemotional AI human-duplicate doesn't have the same potential as a human being. Emotions may be more challenging to handle, but they are an additional resource. $\endgroup$ user72620 Commented Aug 1, 2025 at 14:02
2 $\begingroup$ This question is more the province of philosophy of mind than of AI; here are some detailed answers to your question from Philosophy SE: Is simulating emotions the same as experiencing emotions? and What is the problem with physicalism?
For the record, the accepted answer (by Siri) is not entirely correct; the position in that answer corresponds roughly to John Searle's view on the question, and his is a minority view. Dualists would argue that even with a perfect replication down to the chemical level of brain interactions, an AI still wouldn't experience emotions, as it lacks the purely mental substance/properties that make a mind rather than a machine.
On the completely opposite side of the spectrum, functionalists would answer that such a perfect replication is overkill: even a suitably programmed digital computer could experience emotion, particularly if equipped with higher-order and self-referential states. $\endgroup$
edited Apr 13, 2017 at 12:42 by Community Bot; answered Nov 26, 2016 at 8:05 by Alex S King
1 $\begingroup$ It is definitely a philosophical question with regard to the simulation of emotions, but removing anthropomorphism and focusing on fundamental philosophical principles, it can be shown to be mathematical: ai.stackexchange.com/a/2877/1671 . To be truly felt, the emotion would have to arise out of something machines consider important, such as optimization, computing resources, calculability/intractability, and equilibria. However, it surely becomes philosophical again in terms of the consciousness, or lack thereof, of the system. [Excellent answer, btw!] $\endgroup$ DukeZhou Commented Mar 1, 2017 at 16:12
1 $\begingroup$ Well, it depends on the level of the AI. You can create a highly autonomous AI with deep-learning capabilities and so on, but only of the robotic type. If you created an AI like Ava in the movie Ex Machina, with a humanoid form, deep neural transmissions, and cognitive dissonance, then it could feel. The 'AI' problem is not the chemical and neural transmissions; it's the consciousness. $\endgroup$
answered Nov 6, 2016 at 14:29 by Ivan Cantarino
1 $\begingroup$ Yes and no. If you fully simulated a human brain and all of its functions, it would probably be able to feel emotions very much as we do. But we don't have the capability or knowledge to do that, and we might instead find a "shortcut": a process that is intelligent without simulating a whole brain. In that case, emotions would probably be represented by data values that say "this is good (make it happen again!)" or "this is bad (avoid it!)". This is a very basic example (there are obviously many more emotions), but such values would serve a similar function, and the AI would arrive at solutions similar to ours. But we don't know, and probably no one will ever know, whether that 'bad' data value "feels" to the AI the way the corresponding emotion feels to us. $\endgroup$
answered Nov 9, 2016 at 11:07 by Namnodorel
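The "good/bad data value" idea in the answer above can be sketched in code. This is a purely illustrative toy under my own assumptions, not any real agent framework: every name here (`Agent`, `valence`, `experience`) is invented for the example.

```python
# Toy sketch: "emotions" as scalar good/bad values attached to actions,
# steering future behavior. Illustrative only; not a real API.

class Agent:
    def __init__(self, actions):
        # valence[action]: learned "this was good/bad" signal, initially neutral
        self.valence = {a: 0.0 for a in actions}

    def choose(self):
        # Prefer the action with the highest valence ("seek what felt good")
        return max(self.valence, key=self.valence.get)

    def experience(self, action, outcome, rate=0.5):
        # outcome > 0 ~ "good, make it happen again"; < 0 ~ "bad, avoid it"
        self.valence[action] += rate * (outcome - self.valence[action])

agent = Agent(["approach", "avoid"])
agent.experience("approach", +1.0)   # pleasant outcome
agent.experience("avoid", -1.0)      # unpleasant outcome
print(agent.choose())                # -> approach
```

Whether such a value "feels" like anything is, of course, exactly the open question the answer raises; the code only shows the functional role the answer describes.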
1 $\begingroup$ You first need to express emotions, which you can do without the aid of AI, and then you need someone to perceive that expression and empathize with it. If no one is there to see it, or if I am a psychopath, I would probably say it doesn't have emotions; to that extent, the question is irrelevant/subjective. If you can empathize with characters in movies who "act" emotions, then you get my point. $\endgroup$
answered Nov 27, 2016 at 1:36 by user3874
1 $\begingroup$ IMHO, definitely yes! Everything a person feels (physically or mentally) can be traced to chemical signal processing in the body or brain. If we understand the policy and nature of such signals, we can program them.
There are many psychology (and pseudo-psychology) works in this area; if you are interested, I can suggest:
Cognitive Psychology (Robert L. Solso) describes the cognitive apparatus of the human mind in simple words;
The Psychology of Emotions (Carroll E. Izard) thoroughly describes each kind of emotion: its appearance on the human face (both child and adult), its low-level cognitive mechanism, and related or adjacent emotions;
Books by Paul Ekman ("Telling Lies", "Emotions Revealed", "Unmasking the Face") cover the practical detection of human emotions through the language of microexpressions on the face and body.
Update 2023-01-05, to give more details:
A person forms their own emotions through their own experience, social norms, and other factors. Every emotion has its own strength and can vary from person to person. People behave in different ways when facing similar problems. You can learn to control your emotions and not allow some of them to progress (as society teaches us to do with anger, for example).
It's worth mentioning that there are many psychological conditions, such as learned helplessness, showing that our experience can make us feel unexpected emotions in reaction to certain triggers. This is an example of how emotions, like everything else, can be learned. More than that, you can control which emotions your AI will have.
Of course, there remains the question of whether an AI needs emotions and what benefit it would derive from them. $\endgroup$
edited Feb 4, 2023 at 22:04; answered Nov 27, 2016 at 19:20 by barbariania
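The answer's claim that a designer could "control which emotions your AI will have" can be sketched as a tiny state machine. This is a hypothetical illustration under my own assumptions (the class `EmotionState` and its parameters are invented here): each emotion is a named intensity that events trigger and time decays, with a designer-imposed cap playing the role of the learned regulation the answer mentions.

```python
# Hypothetical sketch: designer-controlled emotions as capped,
# decaying intensities. Illustrative only; not a real framework.

class EmotionState:
    def __init__(self, decay=0.5, cap=1.0):
        self.intensity = {}   # emotion name -> current strength in [0, cap]
        self.decay = decay    # fraction retained per time step
        self.cap = cap        # regulation limit the designer imposes

    def trigger(self, emotion, amount):
        new = self.intensity.get(emotion, 0.0) + amount
        self.intensity[emotion] = min(new, self.cap)  # don't let it "progress"

    def step(self):
        # Emotions fade unless re-triggered
        for e in self.intensity:
            self.intensity[e] *= self.decay

state = EmotionState(decay=0.5, cap=1.0)
state.trigger("anger", 0.8)
state.trigger("anger", 0.8)          # capped at 1.0, not 1.6
print(state.intensity["anger"])      # 1.0
state.step()
print(state.intensity["anger"])      # 0.5
```

The cap and decay are the designer's levers: exactly the kind of control over an AI's emotional repertoire the answer envisions, though nothing here addresses whether the state is felt.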
0 $\begingroup$ Could an AI feel emotions?
Anthropic published an interesting blog post on this yesterday (2026-04-02); the tl;dr is that the LLM they studied does not feel emotions, but it can develop internal functional analogues of human emotions that influence its behavior. Anthropic's research shows that large language models contain internal patterns, sometimes called "emotion vectors", that correspond to concepts like happiness, fear, or desperation. These are not subjective experiences; they are computational states that systematically affect outputs. This aligns with a functional view of emotion: emotions are defined not by "what they feel like" but by what they do, namely mapping inputs to adaptive responses.
For more details, see Anthropic's blog post, the paper Emotion Concepts and their Function in a Large Language Model (2026-04-02), and the YouTube video. $\endgroup$
answered 1 hour ago by Franck Dernoncourt
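One common way such "emotion vectors" are operationalized in interpretability work is as directions in activation space, estimated as a difference of mean activations between emotion-laden and neutral inputs, and optionally added back at inference time to steer behavior. The toy below illustrates that general recipe with random stand-in activations; it is my own illustrative sketch, not Anthropic's actual method or code.

```python
import numpy as np

# Illustrative sketch only -- NOT Anthropic's method or code.
# An "emotion vector" as a direction in a model's hidden-activation
# space: mean difference between activations on "happy" vs. neutral
# inputs, then added to a hidden state to nudge behavior.

rng = np.random.default_rng(0)
d = 8  # toy hidden-state dimensionality

# Stand-ins for activations recorded at one layer of a real model
happy_acts = rng.normal(loc=1.0, size=(16, d))    # "happy" prompts
neutral_acts = rng.normal(loc=0.0, size=(16, d))  # neutral prompts

# The "emotion vector": difference of means
emotion_vec = happy_acts.mean(axis=0) - neutral_acts.mean(axis=0)

def steer(hidden_state, strength=1.0):
    """Add the emotion direction to a hidden state (activation steering)."""
    return hidden_state + strength * emotion_vec

h = rng.normal(size=d)
h_steered = steer(h, strength=2.0)

# The steered state projects further onto the "happy" direction
print(np.dot(h_steered, emotion_vec) > np.dot(h, emotion_vec))  # True
```

Note that this captures only the functional, behavior-shaping side of the claim; it says nothing about subjective experience, which is exactly the distinction the answer draws.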
Linked 9 Can an AI learn to suffer? 4 Will we be able to build an artificial intelligence that feels empathy? 2 Will artificial super-intelligence evolve to have selfishness inherent in biological systems? Related 5 Can we destroy an artificial general intelligence without its consent? 20 What purpose would be served by developing AI's that experience human-like emotions? 7 Why was ELIZA able to induce "delusional thinking"? 6 How can we connect artificial intelligence with cognitive psychology? 9 Can an AI learn to suffer? 4 Can we detect the emotions (or feelings) of a human through conversations with an AI? 4 Will we be able to build an artificial intelligence that feels empathy? 5 Is there any paper, article or book that analyzes the feasibility of acheiving AGI through brain-simulation? Hot Network Questions how to ask for mentorship to have better clarity in research? How should I handle a request for an O-1 Visa letter for a mediocre RA? Semi-intelligent, super-advanced starship called "Grayle" Using GenAI for PhD Research: Productivity vs Enjoyment How to colorized any line drawed with draw (nor with plot expression)? What is this seabird in Deerness, Orkney Islands (fulmar, kittiwake or other)? Nicely organized graph representing Feynman Man at the end of the world type ironic story Proof of the first law of thermodynamics for high school students What are these things on Chell's legs? Position labels Venn diagram How might varying gravities affect plant life? Jn 18:37 : Which birth was Jesus referring to? When is it applicable to provide the contact details of the data protection officer under Art. 14 of the GDPR? How to choose critical p-value? How to best present an intermittent fault to mechanic? How public - like a frog? Examples of cocomplete categories without equalizers How to merge two parallel cylinders with clean topology Ezekiel 24:15-27 "..[Israelites]You will not mourn and you will not weep.." 
is more of a prophetic prediction from God hinting at Israelites' apathy What is this pen-like mouse-like device that connects to a probably-computer via a connector I dunno what it is? Do I Need Ear Protection During Classical Music Concerts Griffiths, Electrodynamics, Example 5.11 (Reverting vector potential to that of natural coordinates) Why is the word чтобы used in sentences like this? more hot questions Question feed
Linked 9 Can an AI learn to suffer? 4 Will we be able to build an artificial intelligence that feels empathy? 2 Will artificial super-intelligence evolve to have selfishness inherent in biological systems?
9 Can an AI learn to suffer? 4 Will we be able to build an artificial intelligence that feels empathy? 2 Will artificial super-intelligence evolve to have selfishness inherent in biological systems?
9 Can an AI learn to suffer?
4 Will we be able to build an artificial intelligence that feels empathy?
2 Will artificial super-intelligence evolve to have selfishness inherent in biological systems?
Related
5 Can we destroy an artificial general intelligence without its consent?
20 What purpose would be served by developing AIs that experience human-like emotions?
7 Why was ELIZA able to induce "delusional thinking"?
6 How can we connect artificial intelligence with cognitive psychology?
9 Can an AI learn to suffer?
4 Can we detect the emotions (or feelings) of a human through conversations with an AI?
4 Will we be able to build an artificial intelligence that feels empathy?
5 Is there any paper, article or book that analyzes the feasibility of achieving AGI through brain-simulation?
Could an AI feel emotions?
Summary
Imagine that humanity has finally created the first humanoid artificial intelligence based on the human brain: would it feel emotions? If not, would it still have ethics and/or morals?