1. I think that life beyond Earth is likely to be uncommon, if it exists at all.
2. If life does exist beyond Earth -- even in vast quantities -- I still think that complex life is probably rare.
3. It's unlikely that we'll ever leave the solar system. In all likelihood, if we're still around at the end of the sun's life, we'll die with it.
4. It's unlikely that we'll ever accomplish anything of importance with respect to the overall processes of evolution and life on Earth.
5. The concept of a multiverse seems completely laughable to me -- not because I think it's impossible, but because it appears untestable and, in all likelihood, irrelevant to anything we do with our time here.
6. I highly doubt Ray Kurzweil's claims of an impending technological singularity; I think he has greatly misapplied several key variables while potentially ignoring many others.
7. Artificial intelligence is likely a very, very difficult and expensive endeavor. I don't expect that it will occur in my lifetime.
8. We won't know the ultimate outcomes of any of the above unless we give them a try. Furthermore, we should abandon any tests, implementations, designs, or plans should they prove too costly, or even detrimental to sentience.
The problem, to me, is that I don't know what will be possible billions of years in the future, given the enormous number of variables involved in human consciousness and its physical manifestations. So anyone who claims to know for certain what will happen over such preposterous time spans, and then declares that humanity needs to disappear from the universe, strikes me as someone who has drawn a premature conclusion. "Of COURSE nothing important is going to happen elsewhere. This place sucks; let's kill ourselves and leave the universe to its own devices, because that plan will eliminate our suffering, and we already know that nothing more important will ever happen anywhere, ever" just doesn't cut it for me.
Don't worry; I'm not a Jew-hating Holocaust denier.