Channel: Jeff Kaufman's Writing

Examples of Superintelligence Risk

In talking to people who don't think Superintelligence Risk is a thing we should be prioritizing, it's common for them to want an example of the kind of thing I'm asking about. Unfortunately, I have...



Conversation with Bryce Wiedenbeck

A few days ago I spoke with Bryce Wiedenbeck, a CS professor at Swarthmore teaching AI, as part of my project of assessing superintelligence risk. Bryce had relatively similar views to Michael: AGI is...




Window Vent

During the summer (here) night-time outdoor temperatures are typically pretty pleasant, but sleeping inside can still be uncomfortable because of heat stored from during the day. The normal fan-based...


Technical Distance to AGI

I think the largest piece of the disagreement between ML people over how to prioritize AI risk may turn out to be: how likely is it that we get AGI soon that looks a lot like systems we have now?...


Superintelligence Risk Project Update

I've now been working on my project of assessing risk from superintelligence for a little over a week, though I was traveling for the end of last week. To keep me motivated, and let other people...

