This is what ChatGPT is supposed to be good at
This morning I had a question that I would normally post online. It was as follows:
Ok, I need a little CSS help. I have a structure like this:
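(Simplified; the real markup has more in it, but .post-item and the h2 are the parts that matter:)

<!-- a run of posts… -->
<div class="post-item">…</div>
<div class="post-item">…</div>
<div class="post-item">…</div>
<!-- …and then a heading for the next group -->
<h2>…</h2>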
I want to add a margin after the last div.post-item prior to an H2. I thought this would work:
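(Roughly; the key bit is that it leaned on :last-child, and the exact margin property doesn’t matter:)

.post-item:last-child {
  /* space after the last post, before the heading that follows */
  margin-block-end: var(--size-step-2);
}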
It does not. Advice?
I would have gotten an answer, for sure. But ChatGPT gave me an answer and really taught me something. Here’s what I got back in an instant.
ChatGPT’s wrong answer
The :last-child pseudo-class selects the last child element of its parent, regardless of its class. So in your case, it’s selecting the last child div element, not specifically the last .post-item element. To achieve what you want, you can use the adjacent sibling combinator (+) to select the h2 element that follows the last .post-item. Here’s how you can do it:

.post-item:last-child + h2 {
  margin-top: var(--size-step-2);
}

This CSS rule will select the h2 element that immediately follows the last .post-item element and apply the margin to it.
The problem is, this doesn’t work. It kept telling me, over and over, that .post-item + h2 or other variants would work. None did. I could use a broader selector (div + h2 works for my purpose, though I would have preferred to be more selective).
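Spelled out, that broader rule is just this (reusing the same custom property from ChatGPT’s snippet for the example):

/* any h2 that directly follows any div gets space above it */
div + h2 {
  margin-top: var(--size-step-2);
}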
I still need my brain
The problem I actually had was structural. My page looked like this:
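Roughly this (simplified; on the rendered page it reads as groups of posts under headings):

<!-- headings and posts are all siblings of one another -->
<h2>…</h2>
<div class="post-item">…</div>
<div class="post-item">…</div>
<div class="post-item">…</div>
<h2>…</h2>
<div class="post-item">…</div>
<div class="post-item">…</div>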
And although that looks hierarchical, it’s not. The correct “fix,” in my opinion, was to change the DOM structure to look like this:
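Along these lines (the wrapper class name here is arbitrary; the grouping is what matters):

<h2>…</h2>
<!-- each run of posts now lives inside its own wrapper -->
<div class="post-list">
  <div class="post-item">…</div>
  <div class="post-item">…</div>
  <div class="post-item">…</div>
</div>
<h2>…</h2>
<div class="post-list">
  <div class="post-item">…</div>
  <div class="post-item">…</div>
</div>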
And then use the following CSS:
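Something like this (the point being that, with the posts grouped, :last-child finally does what I originally wanted):

.post-item:last-child {
  /* inside its wrapper, the last post in each group really is a last child */
  margin-block-end: var(--size-step-2);
}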
If I had asked my question on social media, would people have given me a better response? If I had Googled, would I have found my answer?
I’m not really sure if ChatGPT’s false direction saved or cost me time. But I do know that even relatively straightforward-seeming coding questions are often hard to answer. Problems are tricky, and making them minimally reproducible and asking the right question is the hardest part. Looking at my page structure and realizing that it was too flat to use selectors the way I wanted required understanding and intuition, some of which I had already, and some of which I had to learn and reinforce while playing around with my own site in context.
I have used Copilot a bit and been pretty impressed with it on a narrow set of specific tasks. But so far at least, most of what I’ve been able to get out of language models for coding is just a tiny bit more than a rubber duck. It’s like halfway between talking to myself and having a real person sit with me for fifteen minutes. Maybe that’s enough to be incredibly valuable. But so far, it’s kind of on the margins. When it hits, it’s magic. But you only get the magic once you know enough to ask really precise, smart, correct questions. And at my stage in my career and with my current knowledge, by the time I can confidently ask the right questions, the problems are as good as solved.