We’re #hiring a new Senior SQL Developer in Alpharetta, Georgia. Apply today or share this post with your network.
Broad Reach Partners’ Post
More Relevant Posts
-
1. "How many backup admins does it take to screw in a lightbulb? None, they'd just restore from a previous save point." 2. Why don’t we let database administrators join our band? Because they always want to do a “join” on every “table”! 😄 3. "What do you call a big data analyst who's lost their job? Unemployed, with a huge dataset of skills." 4. "Why did the DBA cross the road? To get to the non-nullable side!" 5. Why did the storage admin get promoted? ...Because they could finally explain the difference between a terabyte and a thunderbolt! 6. What's a manager's favorite motivational quote? 'The buck stops... one desk over.'" 7. "Why was the computer cold at the office? Because it left its Windows open! 8. "Why did the software developer go broke? Because he used up all his cache!" #DebuggingLife #CodingHumor #CoffeeAndCode #ProgrammersBeLike #JustITThings #TechPuns #OfficeAntics #DataDrama #humor #IThumor #Jokes
-
Another stupid recruiter? "I have an urgent requirement with one of my clients; details are given below. If you find yourself suitable for the position, please send me your latest updated resume along with contact details. Please include your employer details as well.

Job Title: Sr. Data Modeler
Project Location: San Jose, CA
Duration: 12 months / Contract
Skills Required and Job Description: Objective: This resource will support data engineering and data analysis work for the Integrated Justice System and BI Platform Modernization projects.
Rate: $45/hour"

---------- RESPONSE ----------

That job cannot be done onsite. Justice system databases usually have over 10,000 tables due to massive legal requirements. The hardware required to build that costs $8,000,000, and it won't fit into a cubicle. The minimum data modeling budget is $30,000,000 over 36 months, NOT 12, with an additional $20,000,000 for everything else: business analysts, developers, unit tests, performance tests, legal review tests, documentation, encryption, cybersecurity, etc. Nobody can do any of that for $45/hour; I could make that driving a truck or delivering pizza.

Additionally, 99% of government buildings are unsafe due to mold growth and sick building syndrome. I would lose any staff I send onsite after 90 days due to illness and permanent lung disabilities.

Lastly, a new report from real estate giant Zillow shows the average mortgage in San Jose is now $9,136 a month, including homeowners insurance and taxes. The minimum income required to live there is $439,000. Your job rate, roughly $90,000 a year, is about 80% below a living wage in San Jose and 99.7% below the required budget to build a justice system data model. You are asking me to do a three-year, thirty-million-dollar job in one year with 0.3% of the required budget. https://lnkd.in/gNgxY4Ax
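For anyone checking the figures in that reply, here is a quick back-of-the-envelope query; the 2,000 billable hours per year used to annualize $45/hour is an assumption, not something stated in the post:

-- Rough arithmetic check of the reply's figures (2,000 billable hours/year is assumed)
SELECT
    45 * 2000                               AS approx_annual_income_usd,   -- 90,000
    ROUND((1 - 90000.0 / 439000) * 100, 1)  AS pct_below_required_income,  -- about 79.5
    ROUND(90000.0 / 30000000 * 100, 2)      AS pct_of_modeling_budget;     -- 0.30

The "80% below" and "0.3%" figures in the reply line up with these numbers.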
-
🌟 SQL Challenge: Deleting Duplicate Records Using Window Functions 🌟

Hi LinkedIn Community, I've been working on optimizing data management tasks and recently tackled the challenge of deleting duplicate records from a table using SQL window functions. Here's a clean and efficient way to do it:

WITH CTE AS (
    SELECT emp_id, name, email,
           ROW_NUMBER() OVER (PARTITION BY email ORDER BY emp_id) AS row_num
    FROM employees
)
DELETE FROM employees
WHERE emp_id IN (
    SELECT emp_id FROM CTE WHERE row_num > 1
);

I'm passionate about SQL and data optimization, and I enjoy solving challenges like this. If you're a recruiter looking for someone with advanced SQL skills and a keen eye for data quality, let's connect! 🚀

#SQL #DataAnalytics #DatabaseManagement #DataIntegrity #TechCareers #JobSearch #SQLServer #DataEngineer #Hiring #Recruitment #CareerOpportunities #DataOptimization
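For comparison, assuming the target engine is SQL Server (the post does not name a dialect), a common variant deletes through the CTE itself and avoids the IN subquery; the CTE name below is illustrative:

WITH Dupes AS (
    SELECT emp_id,
           ROW_NUMBER() OVER (PARTITION BY email ORDER BY emp_id) AS row_num
    FROM employees
)
DELETE FROM Dupes
WHERE row_num > 1;

In dialects that reject referencing the DELETE target inside its own subquery, copying the flagged rows to a staging table and deleting from there is the usual fallback.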
-
Hello Everyone, Greetings of the day! We are #hiring for the below position.

Sr. SQL Developer (in-person interview)
Minneapolis, MN

Please share your resume and LinkedIn profile at the email below (shaurya.garg@xchangesoft.net).

#SQLDeveloper #SQLProgramming #SQLQuery #DatabaseDeveloper #SQLCoding #SQLServer #DatabaseManagement #SQLExpert #SQLSkills #SQLDesign #SQLData #SQLDeveloperLife #SQLPerformance #TSQL #SQLTips #SQLQueries #SQLDevelopment #SQLOptimization #SQLScripts #SQLSolutions
-
Unlock the secrets to hiring and retaining top-tier Data Warehouse Developers! With a staggering 21% job growth rate projected by 2028, the demand for these tech wizards is soaring. Dive into our blog to discover how to navigate this talent surge and keep your data game strong! https://buff.ly/3TRoxbA
-
My talented friend is a skilled SQL Developer actively seeking new opportunities! With 2 years of experience in SQL and database management systems, she's eager to bring her expertise to a new role. Please reach out if your team is looking for a dedicated professional to join your organization. Let's connect and discuss how she can contribute to your team's success!

#DatabaseEngineer #JobSearch #LinkedInJobPost #sqldeveloper #databasemanagementsystem #sql #dbms #structuredquerylanguage #sqljobs
-
Hello Connections!!!

🔍 Just completed an insightful project on data cleaning titled "World Layoffs - MySQL Tutorial", inspired by Alex the Analyst's YouTube channel! Here are some key takeaways from the journey:

🧹 **Essential Steps in Data Cleaning:**
- Importing raw data with careful consideration of data format.
- Copying data to a staging table for manipulation.

📅 **Dealing with Date Columns:**
- Imported data with date columns, ensuring proper formatting.
- Utilized functions to adjust date column formats.

🔍 **Identifying and Handling Duplicates:**
- Leveraged row numbers and partitioning to identify and manage duplicates.
- Explored strategies specific to MySQL for duplicate removal.

🛠️ **Creating and Managing Tables:**
- Created new tables and managed data types efficiently.

🔄 **Updating and Cleaning Data:**
- Adjusted settings for data updates and standardized labels for accurate analysis.

🌍 **Country-Specific Data Cleaning:**
- Applied specific cleaning techniques tailored to country-specific data.

⚙️ **Advanced Operations:**
- Explored converting data types, updating based on conditions, and joining tables.

🚮 **Removing Unwanted Data:**
- Removed unnecessary columns and rows to streamline data.

💡 **Final Thoughts:**
- Data cleaning is not just about tidying up; it's about discovering new insights.
- This project is a valuable addition to any portfolio due to its depth and quality.

Excited to apply these learnings to future projects! Thanks, Alex the Analyst, for the invaluable lessons. 🌟

#DataCleaning #MySQLTutorial #DataAnalysis #LearningJourney
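A minimal sketch of the staging and duplicate-flagging steps described above, assuming a MySQL source table named layoffs with columns in the style of that tutorial (company, location, industry, total_laid_off, date); these names are assumptions, not taken from the post:

-- Copy raw data into a staging table before any manipulation
CREATE TABLE layoffs_staging LIKE layoffs;
INSERT INTO layoffs_staging SELECT * FROM layoffs;

-- Flag duplicates by numbering rows within each candidate-key partition
SELECT *,
       ROW_NUMBER() OVER (
           PARTITION BY company, location, industry, total_laid_off, `date`
       ) AS row_num
FROM layoffs_staging;

Rows with row_num > 1 can then be copied into a second staging table and deleted there, since MySQL does not allow deleting directly from a CTE.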
-
Things I can finally, confidently do in SQL:
◾ identify and use the most common SQL clauses
◾ implement operations
◾ implement aggregations
◾ identify the different types of joins and how/when to use them
◾ identify common data types
◾ create and manipulate tables in VS Code with PostgreSQL

Please accept this silly SQL quote: "My relationship with databases just went from 'it's complicated' to 'we speak the same language'."

Now, who's hiring? Happy Pride! 🌈

#opentowork #dataanalytics #dataanalyst #datanerd #data #sql #vscode #postgresql #coding #newskills
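As a small illustration of the clauses, joins, and aggregations listed above, here is a PostgreSQL-style query over hypothetical employees and departments tables (all table and column names are invented for the example):

-- Join two tables, aggregate per department, and filter the aggregates
SELECT d.name                   AS department,
       COUNT(*)                 AS headcount,
       ROUND(AVG(e.salary), 2)  AS avg_salary
FROM employees e
INNER JOIN departments d ON d.dept_id = e.dept_id
WHERE e.hired_on >= DATE '2020-01-01'
GROUP BY d.name
HAVING COUNT(*) > 5
ORDER BY avg_salary DESC
LIMIT 10;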
-
🎉 I recently completed an **advanced data analysis project** where I analyzed a dataset of company layoffs using **MySQL**. The project involved **cleaning the data** and running various **SQL queries** to extract intermediate-to-advanced insights.

**Key Highlights:**
- Data Cleaning: Efficiently **cleaned and prepared the data** for analysis directly in MySQL, handling **duplicates**, **missing values**, and **standardizing formats**.
- Advanced Analysis: Answered complex questions about the **number and patterns of layoffs** across different companies, industries, and regions.
- Technical Skills: Enhanced my skills in **SQL** and **data analysis**.

**Project Link:** [Layoffs Insight Analysis](https://lnkd.in/gJSXpf5W)

Additional Data Cleaning Steps in MySQL:
- Ensuring Referential Integrity: Verified that all **foreign key relationships** are intact and consistent.
- Data Type Conversion: Converted **data types** to ensure consistency and optimal storage.
- Handling Outliers: Identified and handled **outliers** to ensure accurate analysis.
- Normalization: **Normalized** textual data fields to reduce redundancy and improve query performance.
- Validation: Implemented comprehensive **data validation checks** to ensure data accuracy and integrity.

Special thanks to #AlexFreberg for his invaluable guidance and support in this project.

#DataAnalysis #SQL #MySQL #DataCleaning #DataVisualization #AdvancedAnalytics #DataInsights #TechSkills #DataProject #BigData #BusinessIntelligence #DataDriven #Analytics #AlexFreberg #GitHubProjects #DataCommunity
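A minimal sketch of the label-standardization and type-conversion steps listed above, assuming a MySQL staging table named layoffs_staging with industry and date columns (names are assumptions, not confirmed by the post):

-- Standardize inconsistent text labels
UPDATE layoffs_staging
SET industry = 'Crypto'
WHERE industry LIKE 'Crypto%';

-- Convert a text date column to a proper DATE type
UPDATE layoffs_staging
SET `date` = STR_TO_DATE(`date`, '%m/%d/%Y');

ALTER TABLE layoffs_staging
MODIFY COLUMN `date` DATE;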
-
Hi Connections, we have a consultant DATA ENGINEER with 12+ years of experience, located in Maryland and open to remote or hybrid roles in the USA. Please let me know if you have any requirements on C2C.

#dataengineer #technologies #networking #java #programming #python #data #engineer #w2 #c2c #network #coding #dataanalyst #deeplearning #ai #sql #html #maryland #remote #hybrid #USA #cloud #nosql #etltools #datamodeling #analyzingdata
-