This is one of those things where people frequently get confused.
First off, the whole 60Hz thing is pretty much the standard for LCD panels, but the number doesn't mean what it used to mean on CRTs. Most of those TVs you see advertising 120Hz or higher are using motion interpolation and some pretty creative marketing math to get those numbers, though genuine high-refresh panels do exist.
Hz, or Hertz, refers to how many times a second a screen is redrawn. On old CRT monitors and TVs that literally meant an electron gun sweeping back and forth "painting" the screen. An LCD still refreshes, typically 60 times a second, it just doesn't flicker between refreshes the way a CRT did. With LCD monitors the other spec to watch is RESPONSE TIME, not refresh rate. Response time, measured in milliseconds, is how fast a given pixel can change from one shade to another (usually quoted black-to-white or grey-to-grey). Above about 16ms you can see some ghosting in games, but pretty much any monitor made in the last 6-7 years, if not longer, will have a response time in the single digits.
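Why 16ms? That's roughly how long one refresh lasts on a 60Hz panel. Here's a quick sketch in Python (the helper name is just my own illustration) showing where that figure comes from:

```python
def frame_time_ms(hz):
    # One refresh interval in milliseconds: 1000 ms divided by the
    # number of refreshes per second.
    return 1000.0 / hz

# A 60Hz panel redraws roughly every 16.7ms. A pixel whose response
# time is at or above that can still be mid-transition when the next
# frame arrives, which is the ghosting described above.
print(round(frame_time_ms(60), 1))   # 16.7
print(round(frame_time_ms(120), 1))  # 8.3
```

Single-digit response times leave plenty of headroom inside that 16.7ms window, which is why ghosting is mostly a non-issue on modern panels.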
The next thing to note is that refresh rate and response time are NOT the same thing as fps. Not sure where that idea got started, but it's wrong from start to finish. You could render 500fps on a 60Hz display, no problem; the monitor just can't actually show you more than 60 of those frames per second. Of course there's one other very important thing to factor in, which is the part of your brain that processes visual information from your eyes. The exact cutoff is argued about endlessly, but for most people the perceived smoothness of an animation shows rapidly diminishing returns somewhere past roughly 60-75fps. So unless you have exceptionally good eyesight, even by fighter-pilot standards, the whole "I got XXXfps!" thing is little more than a bragging-rights exercise for gamers. Put another way, most movies run at 24fps and "live" NTSC TV is 29.97fps. Do you notice any particular issues with the illusion of motion there? No? Then why are you worrying so much about a game? (To be fair, film gets some help from the natural motion blur captured in each frame, while a game renders every frame razor-sharp, which is part of why higher framerates do still look smoother in games.)
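To make the refresh-versus-fps distinction concrete, here's a tiny Python sketch (my own simplification, not anything from a driver API, and it ignores tearing, where pieces of several rendered frames can share one refresh) of the basic rule that a monitor can only ever present one new image per refresh:

```python
def frames_shown_per_second(render_fps, refresh_hz):
    # Simplified model: the panel presents at most one new image per
    # refresh, so any frames the GPU renders beyond the refresh rate
    # never reach your eyes as distinct images.
    return min(render_fps, refresh_hz)

# 500fps rendered on a 60Hz display still only shows 60 images a second.
print(frames_shown_per_second(500, 60))  # 60

# If the card can only manage 45fps, the 60Hz panel shows 45 new images
# (some refreshes just repeat the previous frame).
print(frames_shown_per_second(45, 60))   # 45
```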
One thing that CAN artificially limit your fps is v-sync. It forces the video card to deliver frames at the speed of the monitor, which eliminates screen tearing: those ugly horizontal breaks where the top and bottom of the screen show parts of different frames. NVIDIA and AMD have since added adaptive v-sync features to their newer cards and drivers, which SHOULD give you the best of both worlds, minus some small overhead of course. IMO, v-sync gets a bad rap. So what if it caps the FPS of your game? As long as what you're getting is smooth animation, don't worry about the rest. A cap doesn't mean your card isn't keeping plenty in reserve to hold a steady 60fps, and holding a steady 60 is frankly a lot more important than being able to hit a triple-digit peak in a benchmark.
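For the curious, here's a minimal Python sketch of the idea behind a frame cap. This is purely my own illustration of the concept, not how any driver actually implements it; real v-sync waits on the monitor's hardware refresh signal rather than a timer:

```python
import time

def run_capped(target_fps, frames):
    # Rough software frame cap: after each frame's work, sleep out the
    # remainder of the frame interval before starting the next one.
    # Similar in spirit to v-sync, which instead waits for the panel's
    # actual refresh.
    interval = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        # ... the game's render work would go here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < interval:
            time.sleep(interval - elapsed)
    return time.perf_counter() - start

# 30 frames capped at 60fps should take roughly half a second,
# no matter how fast the "render work" itself finishes.
total = run_capped(60, 30)
print(round(total, 1))
```

The point is that the card spends the leftover time idle, in reserve. That headroom is exactly what keeps the framerate steady when a heavy scene comes along.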