When it comes to getting your foot in the door, the debate over whether work experience is more valuable than higher education, or vice versa, is always worth considering. And after landing a job, which one is more likely to keep you employed, help you grow in your career, and earn you a decent living? Some career experts argue that higher education only proves you are book smart, but not necessarily street smart in the real world of work. Others counter that a higher degree ensures you acquire a particular set of skills that are useful in the workplace. But what do work trends, and a closer look at the workplace itself, actually say?