I know this is already super well known to most people, but it was so easy to implement that I just HAD to share it.

Basically, find the distance between the two centre points, and if that distance is shorter than the two radii added together, the circles intersect.

To implement it in your favorite programming language, subtract the two centre points (as 2D vectors) to get the width and height of a right-angled triangle, then use Pythagoras to derive the length of the hypotenuse, which is the distance between the centres.

Add the two radii together and compare the sum to that length. If the length is less than the sum of the radii, the circles overlap and a collision has happened. In practice you can compare the squared distance to the squared radius sum, which gives the same answer and skips the sqrt() call entirely.

Basically, this pseudocode will do it:

[c]
double dx, dy, rt;

/* Vector between the two centre points */
dx = x2 - x1;
dy = y2 - y1;

/* Sum of the two radii */
rt = radius1 + radius2;

/* Compare squared quantities to avoid a sqrt() call */
if (dx*dx + dy*dy < rt*rt) {
    /* There is a collision */
} else {
    /* There is no collision */
}
[/c]
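If it helps, here is a minimal self-contained sketch of the same check in C. The function name `circles_intersect` and the variable names are my own, not from any particular library:

[c]
#include <stdio.h>
#include <stdbool.h>

/* True when the two circles overlap. Compares squared distance
 * against the squared radius sum, so no sqrt() is needed. */
static bool circles_intersect(double x1, double y1, double r1,
                              double x2, double y2, double r2)
{
    double dx = x2 - x1;   /* horizontal leg of the triangle */
    double dy = y2 - y1;   /* vertical leg of the triangle   */
    double rt = r1 + r2;   /* sum of the radii               */
    return dx*dx + dy*dy < rt*rt;
}

int main(void)
{
    /* Radii 3 and 2, centres 4 apart: 4 < 5, so they overlap */
    printf("%d\n", circles_intersect(0, 0, 3, 4, 0, 2));  /* prints 1 */
    /* Same radii, centres 6 apart: 6 > 5, so they do not */
    printf("%d\n", circles_intersect(0, 0, 3, 6, 0, 2));  /* prints 0 */
    return 0;
}
[/c]

One thing to watch: circles that touch at exactly one point (distance equal to the radius sum) count as "no collision" here because of the strict `<`. Use `<=` if you want touching to count.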