Snowflake Cloud Datawarehouse | ETL | SQL Learning/ Upskilling
824 subscribers
5 photos
15 links
DM @Snowflake_Guide
This channel is primarily for people who want to upskill to Snowflake Cloud Dataware House/ETL.
Everyone - please reach out if you have any questions about the next Snowflake Cloud batch. Hurry up, time is running out!
Invest in yourself and upgrade your skill sets.
Final reminder !!!
Snowflake Cloud - Data engineering
Batch starting from 31st Aug.
Enrollments will close by 7 PM IST tomorrow
Interested folks can DM and register
We offer part payment option as well
Note - enrollments for the Snowflake Cloud batch starting tomorrow will close by 7 PM IST. Hurry up if you want to enroll!
Thank you for joining our 1st session of the new Snowflake Cloud batch.
We have shared the recording, notes, code, and session notes for all topics covered today.
Please check and let us know if anything isn't visible.
Which type of table corresponds to a single Snowflake session?
Anonymous Quiz
75%
Temporary
16%
Transient
6%
Provisional
3%
Permanent
👍2
Week 1 completed successfully for new batch...
Everyone - based on the polls above, we will come up with free webinars/sessions on some specific topics. Please cast your vote.
Kindly cast your votes so we can plan some relevant free webinars for you guys :)
Which object cannot be recovered by Time Travel?
Anonymous Poll
17%
Tables
16%
Database
52%
Stage
15%
Schema
👍5
A star schema is composed of how many fact tables?
Anonymous Poll
52%
One
11%
Two
5%
Zero
32%
Any number of facts allowed
👍1
𝐒𝐐𝐋 𝐐𝐮𝐞𝐫𝐲 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐚𝐭𝐢𝐨𝐧 𝐏𝐫𝐨 𝐓𝐢𝐩𝐬! Are your SQL queries experiencing a slowdown? It's time to ramp up your database performance with these 20 expert tips! 💡
1: Preferably, use UNION ALL for faster query results unless removing duplicates is essential.
2: Avoid using OR in join conditions to prevent substantial impacts on query speed.
3: Use UNION instead of UNION ALL when eliminating duplicates from combined queries.
4: Minimize network latency by performing computations or filtering closer to the database using stored procedures.
5: Regularly update statistics and indexes on tables to ensure optimal query execution plans.
6: Simplify and improve code readability by restructuring subqueries for better efficiency.
7: Avoid wrapping indexed columns in functions within WHERE predicates; this keeps predicates sargable so the optimizer can use indexes.
8: Employ EXISTS rather than DISTINCT in one-to-many table joins to boost query performance.
9: Specify individual column names instead of using '*' in SELECT statements for improved accuracy and speed.
10: Optimize JOIN order by arranging joins based on the smallest result set first for faster processing.
11: Streamline queries by eliminating redundant mathematical operations, promoting a simpler and faster approach.
12: Leverage the IN predicate when working with indexed columns for enhanced query efficiency.
13: Break down complex queries into smaller, more manageable parts for improved execution time.
14: Implement query caching to store frequently executed queries for faster retrieval.
15: Profile query performance using tools like EXPLAIN or query execution plans to identify bottlenecks.
16: Avoid unnecessary nesting or chaining of views to reduce query complexity and improve performance.
17: Prefer WHERE over HAVING where possible; HAVING filters rows only after grouping, so filtering late can slow queries down.
18: Consider selective database denormalization to reduce JOINs and boost query speed for specific operations.
19: Reduce the use of DISTINCT conditions to only necessary instances for quicker query performance.
20: Regularly review and fine-tune your queries for enhanced efficiency and optimal performance.
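A quick sketch of tips 1 and 3 using an in-memory SQLite database (table and column names are illustrative, not from the channel; the same trade-off applies in Snowflake and other engines):

```python
import sqlite3

# Illustrative tables: two yearly sales lists with one overlapping customer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales_2023 (customer TEXT)")
cur.execute("CREATE TABLE sales_2024 (customer TEXT)")
cur.executemany("INSERT INTO sales_2023 VALUES (?)", [("alice",), ("bob",)])
cur.executemany("INSERT INTO sales_2024 VALUES (?)", [("bob",), ("carol",)])

# UNION ALL: no de-duplication pass, so it is cheaper -- prefer it
# when duplicates are acceptable (tip 1).
union_all = cur.execute(
    "SELECT customer FROM sales_2023 UNION ALL SELECT customer FROM sales_2024"
).fetchall()

# UNION: removes duplicates, at the cost of an extra sort/hash step (tip 3).
union = cur.execute(
    "SELECT customer FROM sales_2023 UNION SELECT customer FROM sales_2024"
).fetchall()

print(len(union_all))  # 4 rows -- 'bob' appears twice
print(len(union))      # 3 rows -- duplicates removed
```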

Share with your friends/colleagues who might find this useful
#Learning
5👍3
A star schema is composed of how many fact tables?
Answer is One. A star schema has a single fact table with N dimension tables around it. In a snowflake schema you also have one fact table with dimensions around it, but those dimensions are further normalized into sub-dimensions.
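A minimal sketch of that shape, using an in-memory SQLite database with hypothetical table names (fact_sales, dim_product, dim_date): one fact table, each dimension joined directly to it.

```python
import sqlite3

# Star schema: a single fact table surrounded by dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER)")
cur.execute("""CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL)""")
cur.execute("INSERT INTO dim_product VALUES (1, 'widget')")
cur.execute("INSERT INTO dim_date VALUES (10, 2024)")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 100.0), (1, 10, 50.0)])

# Typical star-schema query: the one fact table joins directly to each dimension.
row = cur.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
""").fetchone()
print(row)  # ('widget', 2024, 150.0)
```

In a snowflake schema, dim_product would itself be split into further tables (e.g. a separate category table), adding more join hops.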
👍3
Snowflake playing nicely with other data lakes:
At this time Snowflake supports external stages and external tables on three cloud storage services: AWS S3, Azure Blob Storage, and Google Cloud Storage.
You can use an External Table to point at a data lake that is not housed in the Snowflake environment while still running read-only queries on it with Snowflake compute.
When using Snowpipe to load data into Snowflake, you can use its serverless compute model, which reduces cost because no user-managed virtual warehouse is required.
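A hedged sketch of the external-stage plus external-table pattern described above; the bucket URL, stage name, and table name are hypothetical, and real setups typically use a storage integration rather than inline credentials:

```sql
-- Point an external stage at an S3 data lake outside Snowflake.
CREATE STAGE my_s3_stage
  URL = 's3://my-data-lake/events/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- Expose the staged files as a read-only external table.
CREATE EXTERNAL TABLE ext_events
  LOCATION = @my_s3_stage
  FILE_FORMAT = (TYPE = PARQUET);

-- Query the data lake with Snowflake compute without loading it.
SELECT value FROM ext_events LIMIT 10;
```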
1👍1
Wish everyone a very Happy and prosperous Diwali
4
How many of you would like to join a webinar on the topics below? (2-hr session)
Anonymous Poll
38%
Datawarehouse Concepts
47%
Snowflake Cloud - introduction
15%
SQL Basics
👍5