Macallik86

Superstar
Supporter
Joined
Dec 4, 2016
Messages
6,674
Reputation
1,503
Daps
21,870
Just remember temp tables should be the very 1st tool you reach for.




You should be able to use temp tables for the 1st 3 bullet points.
Temp tables are actually the last thing they want us to reach for 😅. We always forget to drop the temp tables and the DB admin gets frustrated when he has to ask us about a bunch of tables taking up space before he deletes em.
(I did end up using a temp table for #2 btw)

if you use linux/unix and you have SQL-client command-line connectivity to your target, you could just use cron on your local machine.


Use something like sqsh (if possible) to pipeline scripts
I pretty much exclusively use SSMS for work. I might have access to some type of shell commands tho on my work pc.

I did finish that course on Linux which walked through the basics of cron, but it's been a while. So far I've been able to create user-defined functions and stored procedures through the access rights I have at work. I'm like 90% sure I won't have job-scheduling abilities tho, so I will look into cron as a workaround if possible. Otherwise I will just have to manually kick things off each month. I could ask for access and get it as a last-ditch option, but I want to automate things at work without alerting management that I'm automating things. I don't want more work sent my way thanks to my increased efficiency
 

Secure Da Bag

Veteran
Joined
Dec 20, 2017
Messages
41,326
Reputation
21,368
Daps
129,535
Temp tables are actually the last thing they want us to reach for 😅. We always forget to drop the temp tables and the DB admin gets frustrated when he has to ask us about a bunch of tables taking up space before he deletes em.
(I did end up using a temp table for #2 btw)

Create local temp table in stored procedure in SQL Server
A local temporary table in SQL Server remains visible and accessible for the duration of the connection. Once the connection is closed, the local temp table is automatically deleted (dropped).

But yes, if you create one in a "new query" window, it could sit around until your session is closed.
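A rough sketch of that lifetime (table and column names are made up for illustration, untested; `DROP TABLE IF EXISTS` needs SQL Server 2016+):

```sql
-- #local temp table: visible only to this session/connection
SELECT cust_id, SUM(amount) AS total
INTO #monthly_totals
FROM dbo.orders
GROUP BY cust_id;

-- ...use #monthly_totals...

-- it drops automatically when the connection closes, but an explicit
-- drop keeps tempdb tidy on long-lived query windows:
DROP TABLE IF EXISTS #monthly_totals;
```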
 

null

...
Joined
Nov 12, 2014
Messages
30,230
Reputation
5,234
Daps
47,890
Reppin
UK, DE, GY, DMV
Temp tables are actually the last thing they want us to reach for 😅. We always forget to drop the temp tables and the DB admin gets frustrated when he has to ask us about a bunch of tables taking up space before he deletes em.
(I did end up using a temp table for #2 btw)

you are using MS SQL Server right? With Sybase Adaptive Server (SAP?) for example, a temp table whose name starts with '#' gets removed when the client logs out.

I pretty much exclusively use SSMS for work. I might have access to some type of shell commands tho on my work pc.

"sqlcmd" maybe. saying that, microsoft is microsoft, and chaining, cron, and scripting everything (rather than clicky clicky) are borrowed imports.


I did finish that course on Linux which walked through the basics of cron, but it's been a while. So far I've been able to create user-defined functions and stored procedures through the access rights I have at work. I'm like 90% sure I won't have job-scheduling abilities tho, so I will look into cron as a workaround if possible. Otherwise I will just have to manually kick things off each month. I could ask for access and get it as a last-ditch option, but I want to automate things at work without alerting management that I'm automating things. I don't want more work sent my way thanks to my increased efficiency

are you using windows/linux? do you have bash?

type 'which crontab'

try crontab -l and see if anything sensible happens.

do you have root/admin on your local machine?

as a less-than-ideal workaround you could look at Cygwin, but it is not a good implementation.

if no to the above and you are on windows and have a git client installed, check if you have the Git Bash shell as well.


Cygwin and Git Bash should both give you 90-odd percent of bash, which should be enough for sensible scripting (i.e. no CMD scripts, no PowerShell)

if push came to shove you could run it out of a virtual linux machine where you will have bash and crontab.

you will need to make sure that the VM can see (reach) your database server.
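if you do end up on the cron + sqlcmd route, the wiring might look roughly like this (server name, database, paths, and script names are all hypothetical, and the sqlcmd flags assume a trusted-connection setup):

```
# hypothetical crontab entry (add it with `crontab -e`):
# run at 06:00 on the 1st of every month, appending output to a log
0 6 1 * * /home/me/run_monthly_report.sh >> /home/me/report.log 2>&1

# and /home/me/run_monthly_report.sh is little more than:
#!/usr/bin/env bash
# sqlcmd flags: -S server, -d database, -E trusted connection, -i input script
sqlcmd -S my-sql-server -d ReportsDB -E -i /home/me/monthly_report.sql
```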

 

Macallik86

Superstar
Supporter
Joined
Dec 4, 2016
Messages
6,674
Reputation
1,503
Daps
21,870


But yes, if you create one in a "new query" window, it could sit around until your session is closed.
Strange. I was actually taught using #TempTables when I first learned SQL w/ my company, but 2-3 years ago my whole department was told to shift over to the 'SELECT INTO' version instead. Must've been some type of conflict with the tables somehow... maybe blocking data refreshes running concurrently that NOLOCK couldn't resolve? I have no idea but I'll ask in the next team meeting w/ the Reporting Dept

@brehsimilitude I've still got a ways to go before I get to automating the runs of my reports but I will double back to your post if I run into any issues. I plan on spending ~1 hour a day working on automating tasks but I'm still fleshing out and repurposing code for now in addition to fielding daily ad-hoc requests such as the one referenced above.
 

Secure Da Bag

Veteran
Joined
Dec 20, 2017
Messages
41,326
Reputation
21,368
Daps
129,535
Strange. I was actually taught using #TempTables but was told to shift over to the 'SELECT INTO' version instead. Must've been some type of conflict with the tables, perhaps, or maybe blocking data loads? No idea. Wonder if there's some conflict around NOLOCK that #TempTables cannot work around?

Strange. SELECT INTO creates permanent tables. If you're using that to move around data that needs to live past the session, then you may consider global temp tables. But those you HAVE to make sure you drop.
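To make the distinction concrete, a quick sketch (names are made up, untested; `DROP TABLE IF EXISTS` is 2016+ syntax):

```sql
-- SELECT INTO a plain name creates a PERMANENT table: it outlives the
-- session, which is exactly what leaves junk for the DBA to clean up
SELECT cust_id, amount INTO dbo.report_staging FROM dbo.orders;

-- ##global temp table: visible to every session, but dropped once the
-- creating session ends and no other session is referencing it
SELECT cust_id, amount INTO ##report_staging FROM dbo.orders;

-- either way, drop what you created when you're done:
DROP TABLE IF EXISTS dbo.report_staging;
DROP TABLE IF EXISTS ##report_staging;
```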
 

null

...
Joined
Nov 12, 2014
Messages
30,230
Reputation
5,234
Daps
47,890
Reppin
UK, DE, GY, DMV
@brehsimilitude I've still got a ways to go before I get to automating the runs of my reports but I will double back to your post if I run into any issues. I plan on spending ~1 hour a day working on automating tasks but I'm still fleshing out and repurposing code for now in addition to fielding daily ad-hoc requests such as the one referenced above.

is it linux/windows?
 

null

...
Joined
Nov 12, 2014
Messages
30,230
Reputation
5,234
Daps
47,890
Reppin
UK, DE, GY, DMV
Strange. SELECT INTO creates permanent tables. If you're using that to move around data that needs to live past the session, then you may consider global temp tables. But those you HAVE to make sure you drop.

i think the syntax is as follows:

in ms sql server:

select cols into persistent_table from original_table

select cols into #session_temp_table from original_table

select cols into ##global_temp_table from original_table

in sybase ASE:

select cols into persistent_table from original_table

select cols into #session_temp_table from original_table

select cols into tempdb..global_temp_table from original_table

-

can't test any of that because no access now
 

Macallik86

Superstar
Supporter
Joined
Dec 4, 2016
Messages
6,674
Reputation
1,503
Daps
21,870
Randomly decided to connect VS Code to SSMS on my personal computer as a side project today.

Unfortunately, since my VS Code is intertwined w/ WSL2, it apparently has issues interpreting localhost thru the virtualization and the mssql extension wouldn't take. However....

...I started digging around for solutions and I was today years old when I found out about Azure Data Studio. On Medium, someone described it as 'VS Code for SQL Databases', and it is a Microsoft fork of VS Code with a clear emphasis on databases.

I'm digging the sleek, modern, lightweight IDE that has Dark Mode, more extensions, more customizations, etc. It looks to be geared more towards analysts than DBAs at the moment, FYI. Definitely going to put in a request to have it installed on my work PC tomorrow if it isn't already there

Edit: Took an hour of learning/troubleshooting/port config, but now I can use Azure Data Studio on my Chromebook to query databases on my desktop :banderas:
 
Last edited:

Serious

Veteran
Supporter
Joined
Apr 30, 2012
Messages
80,271
Reputation
14,339
Daps
191,136
Reppin
1st Round Playoff Exits
Google Data Analytic Certification Review

Overview: Good if you have no experience or prior job knowledge. Not useful as an IT cert. You'll need further studying to get a job

Price: about $39 a month.

Timeline: I finished the material in two weeks, but I didn't officially finish until 3 months in, due to procrastination on the capstone project.



Skills taught: Tableau, Google Spreadsheet(Excel), R, SQL
Also, job stuff, Data thinking


Overview:
The course outline is this:
1. Foundations: Data, Data, Everywhere
2. Ask Questions to Make Data-Driven Decisions
3. Prepare Data for Exploration
4. Process Data from Dirty to Clean
5. Analyze Data to Answer Questions
6. Share Data Through the Art of Visualization
7. Data Analysis with R Programming
8. Google Data Analytics Capstone: Complete a Case Study


The course mainly goes over a selection of topics: making yourself marketable in the job market, workplace etiquette in data, thinking in a data-analytical way, spreadsheets, SQL, R, and Tableau (and storytelling). Google has a 6-step data process that you can look up.

Briefly talking about my background: I have an economics background and currently work in the data field, but as a recent post-bachelor's grad I was not using a programming language at work. I was/am doing my master's while doing the cert. I learned R through school, Excel proficiently through work, and SQL through DataCamp prior to doing this cert.

That's why my timeline was very short. However, if you treat this like you would high school, you could finish it at the same rate.



I think the topics covered range from great to trash:
SQL: the basics of SQL were pretty good. This is good enough to get you started, i.e. SELECT FROM, INSERT, CREATE TABLE, WHERE, AVG, MIN, etc. Very simple. The questions were not too hard tho
  • BigQuery is trash in terms of usability, but they have good real-world datasets
R: I am an R fanboy due to using it in my economics background
  • Very short compared to how I actually use it
  • Many people were confused by how she presented it.
  • I didn't like the visualization part of it. I myself only use visualization in R to see visually what I need to code. A good non-"R"-looking visualization takes too much code.
  • Not really done well, in my opinion
Tableau: Used for visualizations
  • Not done well
  • In my experience the hardest to learn quickly.
  • SPREADSHEETS are the most important thing to learn, which I don't believe was covered.
  • Even other parts are not well done. Example: one of the case studies uses location data, but the course does not really go in depth on map graphs. So without outside searching, you will be stuck
  • Best to learn outside of the course
Spreadsheets: Well done
  • A data analyst who can't use Excel is a joke
  • Gives a good amount of knowledge in terms of daily operations, formulas, etc.
  • This is the best part of the course

They also cover some aspects like thinking in a data-driven way, thinking critically, making sure you know how to act on a job (i.e. emails, dress, etc.), job search and interview tips, career development tips, etc. Probably the most valuable part of the cert. It's the biggest value over the similar IBM cert.


Job opportunities: Close to none. Google states companies are willing to hire post-cert. This is a lie; even the secret job board Google has is relatively poor. The job board does not have a lot of remote opps (I believe less than 20 off top). If you don't mind moving, you could prob get a job; however, most opps are in high cost-of-living areas.

Companies in general are not aware of this cert either, and it does not hold real cert weight compared to, let's say, a C++ or CompTIA or even a Tableau cert
  • To this point, in the data field, projects > certs, and experience > projects. Make sure to do the capstone project and other projects very soon.
  • To the first point, get a GitHub, LinkedIn, and a portfolio website

Personally, post-cert, I went from a $15 data job to, just this month of May, accepting a job just under 70k. The Google cert DID NOT HELP DIRECTLY (you will see many people call it useless); however, indirectly it gave me a lot of terminology that I could use in interviews. Please keep that in mind when taking the cert. Any questions, ask me.
I started that shyt, got about 90% through, then realized it was garbage.


I mean it’s good for someone who knows absolutely nothing….


But it doesn’t go into any type of detail enough for a job.


I’d recommend data camp or data365 over this cert.
 

greenvale

Superstar
Supporter
Joined
Aug 1, 2017
Messages
6,353
Reputation
1,960
Daps
24,777
Reppin
Delaware
I'd recommend playing around with open data sets from cities if you're trying to get better at data manipulation with Python, data visualization, or just trying to work with different types of data for practice.

For example Philly has this one Datasets - OpenDataPhilly.

Good practice and you might learn something about your city :manny:
 