(2021-02-06 23:58)Atlantis Wrote: As a theriomythic myself I think this is a conclusion a lot of us have come to and will understand perfectly well. I first discussed this with BearX and immediately found it to align more with my own experiences.
Of course, the same people whose "therianthropy" or "otherkinity" is based more on identity than experiences would also like to define what is and isn't therianthropy based on identity rather than experiences.
I'm not saying it's bad to have a sense of identity, but it should be drawn from experiences, not from trivialities or because it's "cool".
Do you see anything bad/harmful about terms like 'beast-like'/'human-like', wild/civilized, etc.? I'm not sure how to phrase the distinction I mean between the two without stepping into connotations of one lacking worth or respect compared to the other. But it would sure help me explain, even to myself, what I see as the difference between these groups. Maybe the issue is in me feeling like they have to have the same value, rather than both being valuable in very different ways?
On the topic of identity vs experiences:
I feel like... it really depends on the way in which you are either. It's more likely that you're otherkin if you're focused on identity (a sort of complex, higher-order interest/need/consideration) and more likely you're therian if you're focused on experience (a lower-order, more instinctual and immediate reaction). I think maybe who you are "matters more" to one type, and what you do "matters more" to the other.
Not that there can't be exceptions, or that people can't care a lot about both. Moreover, even as a therian, you're having a human experience. A human experience is a "complex, higher-order" experience, so you may be inclined to care about identity because you're human now, too. But it makes sense that if you're beast-like in your true self (a therian due to your thinking, your spirit, your soul, whichever is your understanding), you're going to be on the more experiential/doing side rather than just the identity/feeling side.
I am definitely on the side of the spectrum here that I would think is more common in otherkin.
(2021-02-07 2:14)Alliana Wrote: Yeah, I agree with Autumne and Atlantis when it comes to therians and otherkin:
Therians: More beastlike in terms of instincts and behavior (this includes beasts from mythology, known as theriomythics)
Otherkin: Beings defined more by civilization, sapience, and how they conduct themselves (more humanlike)
But yeah, I agree with those two as well as Bear on the issue.
Same question as above, but I wanted to tag and ask you too: do you think there's anything wrong or problematic about 'human-like' vs 'beast-like' as terminology? Is my unwillingness to put beings with animalistic traits in their own separate category, for fear of making them 'less than' human people, misplaced?
(2021-02-07 11:12)DustWolf Wrote: [...] For the AI to learn what it needs to learn in order to develop human-like intelligence itself, it will need to follow the same process [...]
Why is this considered to be the case? I don't understand the need/value/importance of an AI being recognizable as a thing or being that functions as a person only if it develops as a human being does. Can you help me understand why that's regarded as vital for this pursuit? Is my hang-up that I differentiate between 'person' and 'human', whereas this field equates the terms?
(2021-02-07 9:35)kaiyoht Wrote: Or looking at it a different way, emotions and sensations in humans are just chemical reactions in the brain that people interpret as "feelings". They arise to serve a purpose: I get hungry because my body needs caloric energy to sustain itself; I get angry to defend my family/people/whoever from threat of harm (which, over time, makes my family more likely to survive). As cultures change and shift faster than biology does, sometimes that makes emotions feel kinda out of place (or nascent emotions come up, which are weird to handle because they aren't so well understood). But it would be difficult for me to say that these chemical reactions and responses to stimuli that I have are so dramatically different from the sorts of input acceptors/reactions an AI might undergo with similar stimuli. And if I can't tell for sure from the outside that an AI is responding in some identifiably different way than a person might, I don't know what else I can ask for to be more convinced.
I agree with you on that point: if it is functioning as if it is thinking and feeling, then I'm not sure what else I need it to pass before I feel compelled to accept that I am dealing with (and should be thinking in terms of) a thinking and feeling being.
One thing I thought about a lot a couple of years ago was a difference I saw between two types of feelings. It came to mind while I was taking courses about language, and consequently thinking about what language, as 'the operating system of the human brain', allows us to do. It facilitates so much of our processing of complex ideas that it would be very difficult to grasp onto concepts for consideration without it.
The two categories were the 'base feelings' and the 'complex emotions'.
'Base feelings' were the ones that were simple and highly useful to survival/life/sustenance. They were things like 'angry', 'scared', 'happy': the simple reactions one has to things that are troublesome, dangerous, or rewarding.
'Complex emotions' were the states one only experienced because of nuanced thinking, extensive memory, and/or seeing relationships between disparate things. 'Nostalgia', 'fulfillment', 'mourning', 'melancholy', and 'boredom' were states I felt were only accessible to those with the intelligence, memory, and ability to organize thoughts for processing (for humans, given by language).
What you were saying about the difference between biological emotions and newer emotions that arise from a changing society reminded me a lot of that.
Of course, I'll add the caveat that I'm also... uncomfortable with accepting that emotions are merely brain chemicals. I believe strongly that feeling is also a function of "the soul", and thus the way our "true selves" feel can differ from what our human brains are telling us we feel. This has increasingly become my belief as I interact with mental health care and become aware of chemical means of altering mood. I can't really defend that, though, and certainly not scientifically.