If you've ever read Heinlein's The Moon Is a Harsh Mistress, you're no doubt familiar with the character of Mike -- a computer loaded up with enough memory to handle every aspect of the Moon's organization, who then simply "woke up" for reasons that were never made clear. Since we have no idea what it is that makes humans sentient (pre-ghosts? Our beautiful hearts? The fact that we're just naturally badass?), the sheer number of neurons in the brain might be a possibility -- albeit a slim one, since victims of brain-desiccating diseases such as Alzheimer's merely seem to lose touch with everything, not their sentience.
I find it doubtful that machines will ever think, though. Let me use a metaphor an old friend of mine once used:
Let's say you have a box. Inside the box are slips of paper with words on them. One may speak to the box, reach inside, and pull out the reply on a slip of paper. Let's assume one always obtains the right piece of paper -- that is, someone who greeted the box with "hello" would get a "hello" in return. Now load that box with slips of paper carrying replies to any imaginable sentence -- the box can now voice (via the slips of paper) an appropriate reply to anything that could conceivably be said. Does this mean the box is sentient? Does the box at any point comprehend the slips of paper inside it as anything more than the output of a cause-effect system of "words => paper"?
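The box is, at bottom, a giant lookup table. A toy sketch in Python might look like the following -- with a made-up three-entry table standing in for the impossibly large one the thought experiment assumes:

```python
# The "box": every input sentence is mapped, ahead of time, to a canned
# reply. These three entries are hypothetical stand-ins for a table
# covering every imaginable sentence.
replies = {
    "hello": "hello",
    "how are you?": "fine, thanks",
    "what is your name?": "i'm just a box",
}

def box(sentence: str) -> str:
    # Pure cause-effect: words in, paper out. Nothing here "understands"
    # anything -- it is a single dictionary lookup, with "..." as the
    # slip pulled out for sentences not in the table.
    return replies.get(sentence.lower(), "...")

print(box("Hello"))  # -> hello
```

However large you make the table, the mechanism never changes: the lookup is the entire "conversation," which is exactly the point of the metaphor.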
However, I'm going to contradict myself again. A computer program runs according to a set of stimuli and responses: if x, then y. This is very much the way the individual neurons of the brain function: if x, transmit y. If y, inhibit x. I'm oversimplifying things, but I'm sure you get the point. Since your standard neuron operates with nothing more than machine precision, whatever key to sentience there is must lie in the arrangement of the neurons -- the way they interact. If we were somehow to create hundreds of isolated program cells that release stimulants and depressants to selected cells adjacent to them, and arrange them in a way identical to the arrangement of neurons in the brain, we might be on to something.
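The cell idea above can be sketched in miniature. In this hypothetical setup, each "cell" fires when its accumulated charge crosses a threshold, and firing releases stimulants (positive weights) or depressants (negative weights) to the cells it is wired to; the wiring -- who excites or inhibits whom -- is the made-up part:

```python
class Cell:
    """An isolated program cell: if charge >= threshold, fire."""

    def __init__(self, name: str, threshold: float = 1.0):
        self.name = name
        self.threshold = threshold
        self.charge = 0.0
        self.links = []  # (target, weight): + is a stimulant, - a depressant

    def connect(self, target: "Cell", weight: float) -> None:
        self.links.append((target, weight))

    def stimulate(self, amount: float) -> None:
        self.charge += amount

def tick(cells: list) -> list:
    # Two-phase update: decide who fires from the same snapshot,
    # then deliver stimulants/depressants and reset the firing cells.
    firing = [c for c in cells if c.charge >= c.threshold]
    for c in firing:
        for target, weight in c.links:
            target.stimulate(weight)
        c.charge = 0.0
    return [c.name for c in firing]

# Wire three cells: a excites b, b inhibits a -- "if x, transmit y;
# if y, inhibit x," in miniature.
a, b, c = Cell("a"), Cell("b"), Cell("c")
a.connect(b, 1.0)   # stimulant
b.connect(a, -1.0)  # depressant
a.stimulate(1.0)

cells = [a, b, c]
print(tick(cells))  # -> ['a']  (a fires, passing its stimulant on to b)
```

On the next tick b fires and inhibits a, and the little network falls silent -- a reminder of how far three cells are from a brain, where the interesting behavior would have to come from the arrangement of billions of them.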
Of course, the brain is incredibly more complex than I'm making it sound, and if you tried something like the above method, I'd wager you'd get a computer with a great deal of potential but no useful application and no simple means of input. Maybe this method is already used in programs.
Alright, I've stated two very different opinions on the matter directly next to each other, so I feel I owe myself a little clarification: while it's true that we don't understand the mind well enough to identify what separates us from beasts and machines, I remain very doubtful of machine sentience on a fundamental level. It just doesn't seem realistic, though I can't articulate why.
(What's convinced us we're so sentient, anyway? I mean, I think I'm sentient, but I also know more than a handful of people out there think they're Napoleon.)