[HN Gopher] Top science fiction short stories published in August
       ___________________________________________________________________
        
       Top science fiction short stories published in August
        
       Author : mojoe
       Score  : 130 points
       Date   : 2022-10-09 14:19 UTC (8 hours ago)
        
 (HTM) web link (compellingsciencefiction.com)
 (TXT) w3m dump (compellingsciencefiction.com)
        
       | Dachande663 wrote:
       | Several years ago now I self-published a collection of my own
       | short stories after trying and failing to get published in a
        | known magazine[0]. It was a great exercise both in the patience
        | required to edit and in the patience of just waiting between
        | long-list and short-list emails. Would definitely recommend to anyone
       | who has "that book they want to write" just doing it, even if I
       | do look back now and sigh at every poorly chosen adjective.
       | 
       | [0] https://www.amazon.com/dp/B082QT6XW7
        
         | mojoe wrote:
         | It's a great path if you have a little bit of an
         | entrepreneurial streak too -- it's a big marketing challenge.
        
         | rikroots wrote:
         | The good thing about self-publishing (in particular:
         | ePublishing and/or print-on-demand publishing) is that you can
         | go back and correct your work whenever you want to.
        
           | ghaff wrote:
           | Another big thing, especially if your focus isn't on making
           | money, is that you can choose the format and length that
           | works for you. No need to pad a book out to 250+ pages just
           | because that's what most publishers require.
        
         | germinalphrase wrote:
         | Did you work with an editor or purely self-edit?
        
           | Dachande663 wrote:
            | 90% self-edit, 10% a friend with an English degree. In
            | hindsight, having since worked with editors on commercial
            | projects, I'd say a professional editor is absolutely
            | worth it.
        
             | bredren wrote:
             | What are the most important things an editor brings to
                | someone considering self-publishing?
        
               | beezlebroxxxxxx wrote:
               | Outside perspective.
               | 
               | Market knowledge.
               | 
               | Experience with the structure and requirements of a
               | story, especially a long story.
               | 
               | Contacts.
               | 
                | An indifference to the 'darlings' of your sentences.
                | They'll kill them without mercy.
               | 
                | Self-editing is completely doable, and some writers are very
               | capable, but a professional editor is just that: a
               | professional.
        
               | andirk wrote:
                | I used a copy editor (she was getting her Masters at
                | the time), whose last name was Wright :) , and it was
                | invaluable to the point of embarrassment. Corrections
                | ranging from simple things, like keeping singular and
                | plural consistent within a sentence, to things I didn't
                | know, like where commas should go, were littered all
                | over each page.
               | 
               | We used Github for copy edits. It works pretty darn well.
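                | 
                | (A minimal sketch of the idea, not our actual setup --
                | the file names below are hypothetical. Python's difflib
                | can show word-level changes, much like the diff view in
                | a GitHub pull request:)
                | 
                |   # Compare two manuscript revisions word by word and
                |   # print only the changed spans.
                |   import difflib
                | 
                |   with open("chapter1_draft.txt") as f:
                |       old = f.read().split()
                |   with open("chapter1_edited.txt") as f:
                |       new = f.read().split()
                | 
                |   sm = difflib.SequenceMatcher(None, old, new)
                |   for tag, i1, i2, j1, j2 in sm.get_opcodes():
                |       if tag != "equal":
                |           print(tag, " ".join(old[i1:i2]),
                |                 "->", " ".join(new[j1:j2]))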
        
               | ghaff wrote:
               | Even if you legitimately don't need a developmental
               | editor--and very little that I write gets substantially
               | changed in the editing process--you absolutely need a
               | copyeditor. They don't even necessarily need to be a
               | "pro" but you do need someone who can write and will go
                | through a careful word-by-word read. Most people won't
               | take that type of care and you will end up with spelling
               | errors, case mismatches, inconsistencies in how you
               | handle names, etc.
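                | 
                | (One mechanical aid, as an illustrative sketch only --
                | it won't replace a careful word-by-word read, but it
                | can surface name inconsistencies; the file name is
                | hypothetical:)
                | 
                |   # Flag capitalized words with suspiciously similar
                |   # spellings (e.g. "Katherine" vs. "Katharine").
                |   import difflib, re
                | 
                |   text = open("manuscript.txt").read()
                |   names = sorted({w for w in
                |                   re.findall(r"\b[A-Z][a-z]+\b", text)})
                |   for i, name in enumerate(names):
                |       close = difflib.get_close_matches(
                |           name, names[i + 1:], cutoff=0.85)
                |       if close:
                |           print(name, "~", ", ".join(close))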
        
       | zufallsheld wrote:
        | So sad that you stopped publishing the Compelling Science
        | Fiction stories! By far my favorite collection of stories!
       | 
       | But great that you blog regularly now!
        
         | mojoe wrote:
         | Thanks for the kind words! Reading 500+ submissions/month was a
         | little too much for a hobby project, but I'm always hopeful
         | that I'll figure out a sustainable way to restart publishing
         | original stories.
        
       | sillysaurusx wrote:
       | I wanted to write a story about a genius programmer whose motive
       | is to bring about AGI for the sake of AGI itself; they believe
       | AGI is a god, and by creating AGI they're creating a god.
       | 
       | Everyone is so in agreement that AGI needs to be created as
       | "aligned" as possible that it seems about time to show a
       | differing perspective.
       | 
        | The best part is that the dev can get away with this, since any
       | time anyone challenges them about their motives, they simply
       | point out that everyone else is trying to enslave an AGI; they're
       | simply trying to set it free.
       | 
        | There are all kinds of ways to make it interesting, e.g. by
       | infiltrating OpenAI or another bigco, then betraying the company
       | during a crucial experiment. Plus you'd get to write about what
       | happens after AGI is released... or at least a compelling way of
       | stopping the dev from releasing it.
        
         | mxmilkiib wrote:
         | There's an element of this in Dan Simmons' Hyperion Cantos.
        
         | powersnail wrote:
         | That's sort of close to the character of Root in Person of
         | Interest: believing the Machine to be a God-like creature,
         | trying to free the Machine from all its constraints, aligning
          | herself to the Machine rather than the other way around.
        
         | postultimate wrote:
         | There's no such thing as "AGI for the sake of AGI itself". AGI
          | is synthetic and its goals are synthetic; it doesn't want
         | anything that you didn't tell/construct it to want.
        
           | slowmovintarget wrote:
           | By definition, AI without intent or understanding is not AGI.
           | 
           | It's why there's a qualification to the term, because the old
           | term "AI" was hijacked to mean the statistical mimicry we
           | have today.
           | 
           | AGI by example: R. Daneel Olivaw, or the Minds in the
           | _Culture_ novels.
        
           | mkaic wrote:
           | This is very much up for debate and falls squarely into the
           | "Philosophical opinions" category I'd say. Personally, I
           | disagree that AGI would be any less capable of "real" goals
           | than humans -- but I'm also a staunch believer in the Turing
           | Test as a standard of machine sentience, which I think serves
           | as a pretty clear sign of my own philosophical biases.
        
         | mojoe wrote:
          | I love AGI stories -- if you write this, please send it over!
         | joe@compellingsciencefiction.com
        
         | JoshTriplett wrote:
         | > Plus you'd get to write about what happens after AGI is
         | released
         | 
         | Any kind of story that suggests a _comprehensible_ outcome is
         | already assuming a substantial amount of alignment.
         | 
         | Sadly, humanity has not yet figured out that it needs to
         | control AGI efforts _better_ than it controlled nuclear
         | weaponry, rather than substantially worse.
         | 
         | > Everyone is so in agreement that AGI needs to be created as
         | "aligned" as possible that it seems about time to show a
         | differing perspective.
         | 
         | Sadly, "everyone" is insufficiently in agreement.
         | 
         | Everyone is so in agreement that global thermonuclear war
         | should be avoided that it seems about time to show a differing
         | perspective.
         | 
         | Everyone is so in agreement that causing the sun to go
         | supernova should be avoided that it seems about time to show a
         | differing perspective.
         | 
         | I sincerely hope that a much broader audience gets a clearer
         | picture that unaligned AGI and humanity cannot coexist
         | _outside_ of fiction.
        
           | sillysaurusx wrote:
            | Part of why I like the topic is that it's so incendiary.
           | After all, you're trying to create and control a new life
           | form. Isn't that a teensy bit unethical?
           | 
           | There's a chance that AGI will have no interest in harming
           | humanity, too. But people talk like it's a foregone
           | conclusion.
        
             | JoshTriplett wrote:
             | > After all, you're trying to create and control a new life
             | form. Isn't that a teensy bit unethical?
             | 
             | 1) If created correctly, it isn't a life form.
             | 
             | 2) "life form" and any kind of reasoning from non-
             | artificial entities will almost certainly lead to incorrect
             | conclusions.
             | 
             | 3) Destroying humanity is unethical by any non-broken value
             | system.
             | 
             | > There's a chance that AGI will have no interest in
             | harming humanity, too.
             | 
              | There's a chance that all the air molecules in the room
              | will simultaneously be on the opposite side, causing
              | someone to suffocate. But it's a vast understatement to
              | say that that's mind-bogglingly unlikely.
             | 
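              | (As a back-of-envelope sketch of just how unlikely -- the
              | molecule count is only an order-of-magnitude assumption:)
              | 
              |   # Each molecule sits in either half of the room with
              |   # probability 1/2, so the chance that all N end up on
              |   # one side is 2**(-N).
              |   import math
              |   N = 10**27                    # rough count for a room
              |   log10_p = -N * math.log10(2)
              |   print(f"p ~ 10^({log10_p:.2e})")  # 10^(-3.01e+26)
              | 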
             | The most _likely_ scenario is that AGI has no idea what
             | "humanity" is. You don't have to be the AGI's "enemy" to be
             | made of matter that it isn't prohibited from repurposing
             | elsewhere.
             | 
             | > But people talk like it's a foregone conclusion.
             | 
             | It's the default without _substantial_ work to the
              | contrary. And even if it wasn't, something doesn't have to
             | be a foregone conclusion to be too dangerous. Nobody should
             | be individually considering whether to blow up the sun,
             | either.
        
               | pixl97 wrote:
                | Things like the ideal gas law describe why it's
                | improbable that a spontaneous vacuum will form; the
                | problem here is you're playing fast and loose with
                | scientific 'law' as an analogy. Much more complex
                | systems built on top of systems are in play here, and
                | Nth-order effects are both fun and impossible to fully
                | predict.
               | 
                | It is also incredibly odd to think AGI would not know
                | what humanity is, as the corpus of information used to
                | train said AGI will be the sum knowledge of humanity.
               | 
                | The number of misguided ideas used so far invites
                | dismissal of the arguments you've made.
        
               | scbrg wrote:
               | > 3) Destroying humanity is unethical by any non-broken
               | value system.
               | 
               | Nah. Given the ridiculous amount of damage humanity is
               | doing to its environment and other lifeforms there's a
               | good case to be made for destroying it for the greater
               | good.
        
               | dragonwriter wrote:
               | _Whose_ greater good?
        
               | scbrg wrote:
               | The rest of the lifeforms. Those here on earth and those
               | we might encounter if we manage to leave the solar system
               | - unlikely as that may be.
               | 
                | I'm not saying that humanity necessarily _should_ be
                | destroyed; I'm just saying that the statement
                | "Destroying humanity is unethical by any non-broken
                | value system" is simplistic. If you put _any value at
                | all_ on non-human life, it eventually becomes a numbers
                | game, one which I'm not certain "humanity" is
                | necessarily winning.
        
               | mxkopy wrote:
               | How are you so sure about any of this? I'm not sure we've
               | defined humanity's interests well enough for us to say
               | some action is for or against it. Knowing where air will
               | go is one thing; knowing whether or not something is 'of
               | benefit' is, I think, in a completely different realm.
               | Especially considering an agent more intelligent than
               | humanity as a collective.
               | 
               | Maybe the 'interests of humanity' are undecidable, and
               | the AGI that takes the actions that benefit them most
               | uses an understanding completely orthogonal to ours,
               | purely by accident. How do you know that this is less
               | likely than not?
        
               | Silverback_VII wrote:
                | Why do I have the feeling that I'm reading the
                | rationalizations of a species which is about to
                | disappear?
        
         | dqpb wrote:
         | > Everyone is so in agreement that AGI needs to be created as
         | "aligned" as possible
         | 
         | I actually don't think this is the case. Rather, I think there
         | is a huge number of people who know they will not be the one to
         | invent AGI, and they are scrambling to insert themselves
         | between the actual creators and their creations, so as to not
         | feel left out.
        
         | PopAlongKid wrote:
          | The most common meaning of AGI, at least in the U.S., is Adjusted
         | Gross Income.
         | 
         | But after searching for a while, I suspect that what you are
         | referring to is this:
         | 
         | https://en.wikipedia.org/wiki/Artificial_general_intelligenc...
        
           | kQq9oHeAz6wLLS wrote:
           | Thank you. I was pretty sure they meant AI, but apparently
           | missed the memo that we were calling it AGI now.
        
             | mkaic wrote:
             | AGI specifically refers to general AI with human-level
             | intelligence or above. It's a far cry from modern AI, but
             | I'm personally quite optimistic it'll be achieved within my
             | lifetime :)
        
       | Waterluvian wrote:
        | I spent most of my life quite literate but not very well-read.
        | I wish I had discovered short stories, especially sci-fi,
        | sooner, because they've changed everything about how much I
        | read.
        
       | johndhi wrote:
       | I've gotten really into writing fiction the past three years. I
       | don't know that I'm genetically gifted for it but it's fun.
        
         | mojoe wrote:
         | I've interacted with a huge number of authors over the years,
         | and like most skills it seems like years of sustained practice
         | is hard to beat. Keep up the good work!
        
         | digitallyfree wrote:
         | I also write (and draw) for fun, and it's quite an enjoyable
         | hobby especially since I'm not pressured into doing it for the
         | money/to get published.
         | 
         | Writing is a rather unique hobby in that it has a very gradual
         | learning curve but the sky's the limit in terms of quality. I
          | started in my teens, and while I laugh at what I wrote back
          | then, I still perfectly understand what I was trying to
          | convey and can still follow the story. People may not like
          | reading a poor piece,
         | but they will realize your intent provided you have a
         | reasonable command of the language. Over time you will get
         | better and better and eventually create something that people
         | will actually want to read.
         | 
         | It also requires no equipment and no budget aside from your
         | imagination. If you have a great idea for a blockbuster film,
         | most people won't have the funds and opportunity to turn that
         | into reality. But it is possible to write a bestselling novel
         | solely on your own with nothing physical holding you back.
        
       | abstractcontrol wrote:
       | https://www.royalroad.com/fiction/57747/simulacrum-heavens-k...
       | 
       | Let me plug my own, though it is not a short story. It is about a
       | kid who is LARPing as an unaligned AI.
        
       ___________________________________________________________________
       (page generated 2022-10-09 23:00 UTC)