That is kinda up for debate. Carbon levels are definitely rising, but increased carbon doesn't necessarily lead to a warmer planet.
A lot of climate change data is based on temperature records that include urban areas, which run warmer than rural ones (the urban heat island effect). Urbanisation has increased dramatically over the past 100 years, and so has the ambient temperature in those locations. If we adjust for this by removing the urban data points, the warming trend is more gradual and doesn't track the rise in carbon levels.
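For what it's worth, here's a minimal sketch of what that kind of adjustment looks like mechanically. Everything in it is invented for illustration: the station table, its column names, the `is_urban` flag, and the anomaly values are placeholders, not real measurements.

```python
import pandas as pd

# Invented placeholder records (not real measurements): one temperature
# anomaly per station per year, with a flag marking urban stations.
stations = pd.DataFrame({
    "year":     [1950, 1950, 2000, 2000, 2020, 2020],
    "anomaly":  [0.10, 0.05, 0.60, 0.30, 1.10, 0.55],   # deg C vs. baseline
    "is_urban": [True, False, True, False, True, False],
})

# Mean anomaly per year using all stations vs. rural-only stations.
all_mean   = stations.groupby("year")["anomaly"].mean()
rural_mean = stations[~stations["is_urban"]].groupby("year")["anomaly"].mean()

print(pd.DataFrame({"all_stations": all_mean, "rural_only": rural_mean}))
```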
I have a Facebook friend who is a climate scientist, and this is from some of his research (I hope he's OK with me sharing it):
Satellite measurements of the lower troposphere align with the rural-only plot rather than the total mean. Temperatures haven't been consistent with carbon levels, which started to ramp up in the 1940s; in fact, there was a period of cooling from the 1940s to the 1970s.
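If anyone wants to check a claim like that 1940s-1970s dip against a real series (say, a satellite lower-troposphere dataset or a rural-only composite), the test is just a linear trend over a time window. A sketch, with a clearly invented demo series standing in for real data:

```python
import numpy as np

def trend_per_decade(years, temps, start, end):
    """Least-squares slope in deg C per decade over the window [start, end]."""
    years, temps = np.asarray(years, float), np.asarray(temps, float)
    m = (years >= start) & (years <= end)
    slope = np.polyfit(years[m], temps[m], 1)[0]
    return slope * 10.0

# Invented demo series (placeholder shape only, not real data); swap in an
# actual satellite or rural composite to run the test for real.
years = np.arange(1900, 2021)
temps = np.interp(years, [1900, 1940, 1975, 2020], [-0.3, 0.0, -0.1, 0.9])

print(round(trend_per_decade(years, temps, 1940, 1970), 3))  # negative => cooling
print(round(trend_per_decade(years, temps, 1975, 2020), 3))  # positive => warming
```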
He also explains (and it gets quite technical) that, according to the theory of CO2-driven global warming, doubling CO2 should produce about a 1 °C rise in surface temperature. However, CO2 has only increased from ~0.03% to ~0.04% of the atmosphere since the late 19th century, which is log2(0.04/0.03) ≈ 0.4 of a doubling, so roughly 0.4 °C of expected warming, whereas observed warming is 1-1.5 °C. On that basis, man-made warming would account for only about 20-40% of the increase we've seen, yet the models attribute 100% to human activity. Looking at the modelling further, it incorporates feedback loops based on increased humidity, and digging into those shows they actually predict 3-5 times the warming that has occurred. Some man-made cooling was then incorporated into the modelling to bring the figure back down. Basically, the models are retroactively adjusted to fit data that keeps defying them.
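The arithmetic in that paragraph is easy to check yourself. A quick back-of-envelope, taking the no-feedback figure of ~1 °C per doubling and the concentrations he quotes at face value:

```python
import math

co2_then, co2_now = 0.03, 0.04      # atmospheric CO2, percent by volume
sensitivity = 1.0                   # deg C per doubling (no-feedback figure above)

doublings = math.log2(co2_now / co2_then)   # log2(4/3) ~= 0.42 doublings
expected = sensitivity * doublings          # ~0.42 deg C expected

for observed in (1.0, 1.5):                 # observed warming range, deg C
    print(f"expected {expected:.2f} C is {expected / observed:.0%} "
          f"of {observed:.1f} C observed")
# Prints ~42% and ~28%, roughly in line with the ~20-40% figure quoted above.
```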
I can ask him to come here for a Q&A if anyone is interested.