^ Actually, that should have been there since it’s part of the “like” syntax. What I forgot was to close the entire statement. The line should be:
and (v_Network_DATA_Serialized.IPAddress0 like('192.%'))
and (v_Network_DATA_Serialized.IPAddress0 like'192.%')
^ also works, but a good DBA would slap you for sloppy syntax like that.
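For anyone following along, here's a minimal sketch of what that LIKE filter does, using an in-memory SQLite table to stand in for SCCM's v_Network_DATA_Serialized view (the real thing lives in SQL Server, and the sample addresses below are made up):

```python
import sqlite3

# Stand-in for the SCCM view; only the one column we filter on.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE v_Network_DATA_Serialized (IPAddress0 TEXT)")
conn.executemany(
    "INSERT INTO v_Network_DATA_Serialized VALUES (?)",
    [("192.168.10.5",), ("10.0.0.12",), ("192.0.2.44",)],
)

# The pattern '192.%' matches any IPAddress0 starting with "192."
rows = conn.execute(
    "SELECT IPAddress0 FROM v_Network_DATA_Serialized "
    "WHERE IPAddress0 LIKE '192.%'"
).fetchall()
print(rows)  # [('192.168.10.5',), ('192.0.2.44',)]
```

Same idea either way you space the LIKE; the quotes just have to close.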
How long did the query take before you added the additional join? And the big question: is this something that will be run constantly by people using the software, or something that gets run once in a while?
What I’m getting at is: if there’s one guy who runs this once a day and the query went from 300 ms to 2 seconds, it’s no big deal. If there are 1,000 people hitting this query hundreds of times per day and it went from a sub-second response to 2 seconds, you could seriously hammer the database.
I came from a hosting place where the developers had free rein over their own queries and brought the main database to its knees as the company grew and the datasets got huge. We spent months identifying all the poorly written queries and improperly indexed tables in order to undo the damage.
The query runs a report in SCCM, which of course pulls from SQL. There are about 4-5 people on campus, including myself, who even have the ability to see these reports, let alone run them. This particular report will probably only be a once-in-a-while type thing. Our new CIO wanted a list of our computers on campus along with their locations. Using the IP addresses I can identify which building they are in. We only run reports once in a while to see how our SCCM packages’ success/fail rate looks, get inventory on software and/or hardware, and some other miscellaneous info. The above report was built in to SCCM and written entirely by M$ (aside from my little contribution). I’ve just started manipulating reports to get data that I want that isn’t built in to SCCM.
It’s really entertaining how fast you learn proper SQL syntax when you start dealing with larger databases. I used to leave a lot of open-ended SELECT * FROM statements when dealing with tables of a few columns and maybe 100-200 rows, but as soon as I started working with 500,000 rows and doing joins across about 50 columns, I quickly learned SQL optimization. Haha
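The SELECT * habit is easy to demonstrate. Here's a toy sketch (hypothetical table and column names, SQLite standing in for the real server): the wide query drags back every column, including big ones the report never reads, while naming the columns keeps the result narrow.

```python
import sqlite3

# Hypothetical wide-ish table; imagine 50 columns instead of 4.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE computers (Name0 TEXT, IPAddress0 TEXT, OS0 TEXT, Notes0 TEXT)"
)
conn.execute(
    "INSERT INTO computers VALUES (?, ?, ?, ?)",
    ("LAB-01", "192.168.1.10", "Win10", "x" * 1000),  # Notes0 is a big blob
)

# Sloppy: pulls every column, including the 1,000-character Notes0.
wide = conn.execute("SELECT * FROM computers").fetchall()

# Better: name exactly what the report needs.
narrow = conn.execute("SELECT Name0, IPAddress0 FROM computers").fetchall()
print(narrow)  # [('LAB-01', '192.168.1.10')]
```

Multiply that wasted payload by 500,000 rows and a few joins, and the difference stops being academic.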
IMO, it’s difficult to learn unless you have interesting challenges crop up on a regular basis AND a large dataset that you know your way around…or you’re a full time developer. In my previous job I got exposed a little bit, but only in emergency situations.
Hell, get a Twitter Streams account and free AWS RDS database. Pour a common term into your DB and just run queries for fun. Quick way to have a million entries in a DB and the compute power to process your queries.
Come to think of it, maybe I just don’t have a life…