There are a few ways you can accomplish this. A regulator, as others have posted, would be the best-performing option (it gives the most stable output). A switching regulator will be more efficient than a linear regulator, and so will run cooler.
It would be possible to use a simple dropping resistor, too. However, to do that, you need to determine what the actual operating voltage and current of the light are by measuring it in operation with a meter. After that, it's a matter of applying a little math in the form of Ohm's law and related basic electrical principles.
Let's say that you measure 9 V and 600 mA. (These are, of course, not actual measurements!) Your supply voltage varies between maybe 12 V and 14.4 V, so take 13.2 V as an average. The resistor needs to drop 13.2 V - 9 V = 4.2 V at 600 mA, which by Ohm's law works out to a nominal value of 4.2 V / 0.6 A = 7 Ω. It dissipates 4.2 V × 600 mA ≈ 2.5 W, so a 5 W resistor would be appropriate. (Incidentally, a linear regulator would dissipate the same amount of power under these conditions.)
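If you want to plug in your own measurements, here's a minimal Python sketch of the arithmetic above. The 9 V / 600 mA figures are the hypothetical values from the example, not real measurements:

```python
# Dropping-resistor sizing via Ohm's law.
# LED figures below are the hypothetical example values; use your own readings.
V_SUPPLY = 13.2   # V, average of a 12-14.4 V vehicle supply
V_LED = 9.0       # V, measured LED operating voltage (hypothetical)
I_LED = 0.600     # A, measured LED operating current (hypothetical)

v_drop = V_SUPPLY - V_LED   # voltage the resistor must drop: 4.2 V
r = v_drop / I_LED          # R = V / I -> 7.0 ohms
p = v_drop * I_LED          # P = V * I -> 2.52 W; pick a 5 W part

print(f"R = {r:.1f} ohm, P = {p:.2f} W")
```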
At a 14.4 V supply, the resistor drops 5.4 V (the LED's voltage is basically constant, at least for this sort of simple analysis), so the current rises to about 770 mA and the resistor dissipates a bit over 4 W. The LED would be somewhat overdriven as well, but if it is reasonably designed in terms of heat management, its life should not be shortened too much. If you preferred, you could choose a higher-value resistor and instead accept a dimmer light at more typical 12 V system voltages.
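Continuing the sketch, here's the worst case at full charge voltage, again with the hypothetical figures and treating the LED voltage as constant:

```python
# Worst case with the 7-ohm resistor at full charge voltage.
# Treats the LED forward voltage as constant (the simple analysis above).
V_SUPPLY_MAX = 14.4   # V, fully charged / charging system
V_LED = 9.0           # V, hypothetical LED voltage as before
R = 7.0               # ohms, the resistor chosen above

v_drop = V_SUPPLY_MAX - V_LED   # 5.4 V across the resistor
i = v_drop / R                  # I = V / R -> about 0.77 A
p = v_drop * i                  # P = V * I -> about 4.2 W in the resistor

print(f"I = {i * 1000:.0f} mA, P = {p:.2f} W")
```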
It's possible the LED unit has a built-in regulator and needs no special circuitry to connect to a 12 V system. There's no way to know without examining, and perhaps disassembling, it. My hunch would be that it has no such design nicety.