Uncategorized – jack of all trades master of some https://jackofalltradesmasterofsome.com/blog Consultant - Real Estate - Author - Business Intelligence

Decrypt and Save a Text or CSV with PGPy and Python https://jackofalltradesmasterofsome.com/blog/2023/03/17/decrypt-and-save-a-text-or-cvs-with-pgpy-and-python/ Fri, 17 Mar 2023 18:36:18 +0000

Hopefully this helps someone decrypt and save a text or CSV file with PGPy and Python. Use Kleopatra and OpenPGP to generate and save your key pair. You can obtain the private key by exporting "Backup Secret Keys".

Then run this code to decrypt and save the file in whatever location you need.

import pgpy

# Load the encrypted message and the exported private key
emsg = pgpy.PGPMessage.from_file('YOUR FILE.pgp')
key, _ = pgpy.PGPKey.from_file('YOUR PRIVATE KEY.asc')

# Unlock the key with your passphrase and decrypt the message once
with key.unlock('YOUR PGP PASSWORD'):
    decrypted = key.decrypt(emsg).message

# The decrypted payload can be str or bytes; normalize to bytes before writing
if isinstance(decrypted, str):
    decrypted = decrypted.encode('utf-8')

# Write the bytes to file
with open("my_file.txt", "wb") as binary_file:
    binary_file.write(decrypted)
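
For reference, going the other direction is just as short. This is only a sketch, and it assumes you also exported the public half of the key pair from Kleopatra (the file names below are placeholders):

import pgpy

# Load the public key (placeholder file name from a Kleopatra export)
pubkey, _ = pgpy.PGPKey.from_file('YOUR PUBLIC KEY.asc')

# Wrap the file contents in a PGP message and encrypt it with the public key
msg = pgpy.PGPMessage.new('my_file.txt', file=True)
encrypted = pubkey.encrypt(msg)

# Save the ASCII-armored result
with open('my_file.txt.pgp', 'w') as armored_file:
    armored_file.write(str(encrypted))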

That’s it! Keeping it short and simple.

Best Science-Themed Gifts: Get Your Nerd On https://jackofalltradesmasterofsome.com/blog/2022/12/20/best-science-themed-gifts-get-your-nerd-on/ Tue, 20 Dec 2022 00:21:37 +0000

Are you struggling to find the perfect gift for the scientist in your life? Look no further! We’ve compiled a list of the best creative gift ideas that are sure to impress any science lover. Here are our top 5 Best Science-Themed Gifts.

A periodic table throw blanket

  • This cozy blanket is not only functional, but it’s also educational! The periodic table of elements is featured prominently, making it the perfect gift for the chemistry enthusiast in your life.
Periodic Table Throw Blanket

Custom Handmade Patent

  • Has your favorite scientist been awarded a patent? Help them show off their hard work with a custom patent designed for them to hang in their office or home.
Custom Handmade Patent on Etsy

A science-themed cookbook

  • If your scientist friend loves to cook, consider gifting them a cookbook filled with recipes that incorporate scientific principles. From molecular gastronomy to edible chemistry experiments, there are plenty of options to choose from.
The Food Lab: Better Home Cooking Through Science

A DIY terrarium kit

  • For the botanist or plant lover in your life, consider giving a DIY terrarium kit. This gift allows them to create their own miniature ecosystem and learn about plant care in the process.
DIY Terrarium Kit

A science-themed board game

  • Board games make for a great gift, and there are plenty of options available with a science twist. From chemistry-themed games like “Elements” to biology-themed games like “Pandemic,” there’s something for every science enthusiast.
Pandemic Board Game

A science fiction novel

  • Give the gift of a good read with a science fiction novel. From classic works by authors like Isaac Asimov to more modern offerings, there are plenty of options to choose from.

No matter what gift you choose, your scientist friend is sure to appreciate the thought and creativity you put into it. Happy shopping! Best Science-Themed Gifts

Get Data from NetSuite using Azure Data Factory https://jackofalltradesmasterofsome.com/blog/2022/05/17/get-data-from-netsuite-using-azure-data-factory/ Tue, 17 May 2022 16:25:29 +0000

Get Data from NetSuite using Azure Data Factory. Because NetSuite is an Oracle product, getting its data into your data warehouse can be a little tricky. Here is a helpful guide to get started.

Side Note: Want to learn SQL or Python for free, in less than 10 minutes a day and less than an hour total? Sign up for my free classes delivered daily right to your email inbox!

Back to the article…

Helpful Links

https://docs.oracle.com/en/cloud/saas/netsuite/ns-online-help/section_3994744300.html

https://www.netsuite.com/help/helpcenter/en_US/srbrowser/Browser2020_1/analytics/record/transactionLine.html

1. Set Up the VM

A virtual machine is required: you need a server where you can install the correct ODBC drivers and a SQL Server instance to host the linked server. The VM will also run the self-hosted Integration Runtime that lets Azure Data Factory connect to and leverage the linked server.

  1. Items to install on the VM
    1. SQL Server Express LocalDB
      • Needed to test and verify the ODBC connection to NetSuite and to create the linked server connection
      • SQL Server Express LocalDB – SQL Server | Microsoft Docs
      • Install SQL Server with Mixed Authentication Mode
        • Create a user on SQL Server that has rights to all databases and linked servers needed in the final steps
    2. SQL Server Management Studio
      • Needed to test and verify the linked server ODBC connection to NetSuite
    3. Azure Data Factory Integration Runtime
      • Needed to connect the VM to Data Factory
      • Download Microsoft Integration Runtime from the official Microsoft Download Center
    4. NetSuite ODBC Driver for Windows
      • See the next steps for download and setup instructions

2. Set Up the NetSuite ODBC Driver for Windows

You will need to install the NetSuite ODBC drivers on the VM so that it can communicate with the NetSuite servers.

  1. The driver will be provided by your admin or via the NetSuite portal.
    1. You will need to obtain the server host, service port, service data source, account ID, and role ID from NetSuite, as well as configure a user that has access to these settings.
  2. Install NetSuiteODBCDrivers_Windows64bit.
  3. Navigate to your ODBC settings in Windows and add a new System DSN. If everything installed correctly you will now see a "NetSuite Drivers 64bit" option to select.
  4. In the settings for this DSN, enter the account ID and role ID from the NetSuite setup instructions.
  5. Once you select "Test Connect" you will be prompted to enter the credentials you configured in NetSuite. A quick way to verify the new DSN outside of the ODBC dialog is sketched below.
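
If you have Python on the VM, a short pyodbc script is an easy way to sanity-check the DSN before moving on to the linked server. This is only a sketch: it assumes the DSN was named "NetSuite" and uses placeholder credentials for the NetSuite user you configured above.

import pyodbc

# Connect through the System DSN created above ("NetSuite" is assumed here);
# UID/PWD are placeholders for the NetSuite credentials configured for the role.
conn = pyodbc.connect("DSN=NetSuite;UID=your_user@example.com;PWD=your_password")
cursor = conn.cursor()

# List a handful of the tables the driver exposes as a quick smoke test
for i, table in enumerate(cursor.tables()):
    print(table.table_name)
    if i >= 9:
        break

conn.close()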

3. Set Up a Linked Server Connection

With SQL Server Express and SQL Server Management Studio set up, you can now create a linked server connection to NetSuite.

  1. In the General tab, enter the following information:
    • Linked server: NETSUITE (the name you want to appear in the object explorer)
    • Provider: Microsoft OLE DB Provider for ODBC Drivers
    • Product name: NetSuite.com
    • Data Source: the name of your 64-bit ODBC data source from the previous step. This was "NetSuite" in this example.
    • Provider string: DSN=NetSuite.com
  2. In the Security tab, select "Be made using this security context:". For the remote login, use your NetSuite login and password.
  3. The linked server should now appear in the object explorer.
    • You can now query the tables with queries similar to "Select * from [NETSUITE].[Database].[Administrator].[ACCOUNT_ACTIVITY]"

4. Set Up the Integration Runtime in Azure Data Factory

Now that a linked server has been set up from your VM to NetSuite, you can use this connection in Azure Data Factory. Since the VM is considered an on-premises server, you will need to install a self-hosted Integration Runtime so that Azure Data Factory, which runs in the cloud, can find and communicate with the VM.

5. Set Up and Install the Integration Runtime on the VM

You now need to install the Integration Runtime on the VM and configure it so that it can find and communicate with the Azure Data Factory service you configured in the previous step.

If all was done correctly, you will now see a valid connection inside Azure Data Factory.

6. Azure Data Factory Setup

Everything should now be set up to move data from NetSuite to your destination in Azure Data Factory:

a. Set up a Linked Service in Data Factory

b. Create a Data Source

c. Create a Pipeline and Move Data

Use Azure Blob Storage for SFTP https://jackofalltradesmasterofsome.com/blog/2022/03/08/use-azure-blob-storage-for-sftp/ Tue, 08 Mar 2022 15:55:29 +0000

This feature is in preview at the time of writing and may be subject to change. Here is a quick guide on how to use Azure Blob Storage Gen2 as cheap, serverless SFTP, with a step-by-step guide on setup and access. By using local users, you can create a series of SSH users and passwords for segmented folders and control access to your SFTP file storage.

Use Azure Blob Storage for SFTP

  1. Navigate to Subscriptions and select the subscription you want to activate the SFTP feature on.
  2. Select "Preview features" and then search for "SFTP". Select "SFTP support in Azure Blob Storage" and click Register.
  3. Create a new Gen2 storage account in Azure. Use any name you need and set the tier to Standard.
  4. Under the advanced settings you must enable hierarchical namespace in the storage settings as well as enable SFTP.

Setup SFTP in Azure Blob Storage

  1. In your storage account, create a new container.
  2. Under Settings you should now have an SFTP option to navigate to.
  3. Create a new local user and select "SSH Password".
  4. Next, give this user the correct permissions to the containers and folders. Once completed, the password will be provided to you.

Connect to Azure SFTP in a Client

  1. Open an SFTP tool such as WinSCP or FileZilla.
  2. For the host name enter the blob storage endpoint, along with the user name and password from the previous steps.
    1. storagename.blob.core.windows.net
  3. Hit connect. You may be prompted to approve a host key. Once you accept, you should now have access to SFTP on blob storage! If you would rather connect from a script, a Python sketch is shown below.
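
Here is a minimal Python sketch of the same connection using paramiko. The host, user name, and password are placeholders; use the exact connection values shown on the SFTP blade of your storage account (local user names are typically prefixed with the storage account name).

import paramiko

# Placeholder values; copy the real ones from the SFTP blade of the storage account
host = "storagename.blob.core.windows.net"
username = "storagename.localuser1"   # Azure shows the exact user string to use
password = "generated-password"

# Open an SSH transport on port 22 and start an SFTP session over it
transport = paramiko.Transport((host, 22))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)

print(sftp.listdir("."))                    # list files in the user's home folder
sftp.put("local_report.csv", "report.csv")  # upload a test file

sftp.close()
transport.close()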

For more information, be sure to check out the rest of this blog for Azure tips and tricks and business intelligence tutorials. Azure belongs to the Microsoft suite of products; for more details, please see Microsoft’s documentation portal.

Connect to Azure SQL using Active Directory and Grant Access outside Organization https://jackofalltradesmasterofsome.com/blog/2022/02/13/connect-to-azure-sql-using-active-directory-and-grant-access-outside-organization/ Sun, 13 Feb 2022 20:08:58 +0000

Learn how to connect to Azure SQL using Active Directory and grant access to users outside of your organization. This comes in handy when you have a user who sits outside of your organization and needs to log into a SQL environment you have provisioned for them. They will need to be invited as a guest user and then have the appropriate access set up so they can log in without needing a hard-coded SQL authentication login, which creates risk.

1. Allowing Active Directory to Authenticate to SQL Server

  • Navigate to the server you want to allow AD access to and click on "Not Configured" next to Active Directory admin.
  • Set an admin, using your Azure Portal account as the admin.
  • Leave the check box for "Azure Active Directory authentication only" unchecked. This way you can continue to use defined SQL credentials when needed.
  • Be sure to click Save to save the changes.
  • Head back to SSMS and log in using Azure Active Directory – Universal with MFA, or use the correct setting approved by your administrator.

2. Invite User to Your Organization

  • Add a new guest user from the Users section in your Azure Portal.
  • The user will get an email to activate their account.

3. Adding an External AD User to Your Database

  • Open a new query window and run the following commands, editing the user name and business domain to match the guest user created in step 2. A quick way for that user to verify access from Python is sketched after the commands.

CREATE USER [username.com#EXT#@business.onmicrosoft.com] FROM EXTERNAL PROVIDER;

ALTER ROLE [db_datareader] ADD MEMBER [username.com#EXT#@business.onmicrosoft.com];
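
If the external user wants to verify their access from code rather than SSMS, one option is a pyodbc connection using Azure AD interactive authentication. This is only a sketch: the server, database, and user values are placeholders, and it assumes the Microsoft ODBC Driver 18 for SQL Server is installed.

import pyodbc

# Placeholders: substitute your server, database, and the guest user's sign-in address
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:yourserver.database.windows.net,1433;"
    "Database=yourdatabase;"
    "Authentication=ActiveDirectoryInteractive;"  # opens an Azure AD login prompt with MFA support
    "UID=guest.user@theirdomain.com;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT SUSER_SNAME(), DB_NAME();")  # confirm who you are connected as, and to what
print(cursor.fetchone())
conn.close()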

4. Conclusion

And that is all you need to connect to Azure SQL using Active Directory and grant access to users outside your organization.

Best Men’s Fashion Blog Links https://jackofalltradesmasterofsome.com/blog/2021/12/26/beset-mens-fashion-blog-links/ Sun, 26 Dec 2021 19:40:27 +0000
  • Cheap Business Casual Capsule Wardrobe for Men
  • The Capsule Wardrobe Guide for men 2022
  • Best Male Fashion Watches on Amazon Under $50
  • The Guide on How to Care for Men’s Sweaters
  • The Basic Bastard Wardrobe Guide
  • Basic Layering Tips for Men
  • Affordable Wedding Bands For Men: The Best Cheap Wedding Bands
  • Ideal Men’s Haircut for Every Face Type
  • Best Men’s Fashion Blog Links

    Visual Studio Database projects to Deploy Azure Synapse Pool https://jackofalltradesmasterofsome.com/blog/2021/02/15/visual-studio-database-projects-to-deploy-azure-synapse-pool/ Mon, 15 Feb 2021 20:36:56 +0000

    Visual Studio Database projects to Deploy Azure Synapse Pool

    Side Note: Want to learn SQL or Python for free, in less than 10 minutes a day and less than an hour total? Sign up for my free classes delivered daily right to your email inbox!

    Now back to the article…

    Get Visual Studio 2019

    1. Download and install Visual Studio 2019 Community Edition
      1. https://visualstudio.microsoft.com/
    2. Verify the "Data storage and processing" workload is installed and make sure all updates are applied.

    Create the database project

    1. Create a new project in Visual Studio.
    2. Create a new SQL Server Database Project.
    3. To add your first item, go to the new solution and select Add -> New Item.
    4. From the list of items, select "Table (Data Warehouse)", as this allows for slightly different create table statements with columnstore indexes.
    5. Add your code to the editor. Some items may still show an error, but it will not be an issue. Save when ready.

    Update the Target Platform

    1. Right-click the project and select Properties.
    2. Set the Target Platform to Microsoft Azure SQL Data Warehouse and save.

    Publishing Changes to Server

    1. Right-click the project in Solution Explorer and select "Publish".
    2. Select the Azure SQL Data Warehouse as the target database platform and select "Publish".

    Visual Studio Database projects to Deploy Azure Synapse Pool

    Automatically pausing and resuming Azure Workspace Synapse Pool Using Azure Data Factory https://jackofalltradesmasterofsome.com/blog/2021/02/10/automatically-pausing-and-resuming-azure-workspace-synapse-pool-using-azure-data-factory/ Wed, 10 Feb 2021 20:59:10 +0000

    Automatically pausing and resuming an Azure Synapse workspace SQL pool using Azure Data Factory.

    1. Create a new Azure Data Factory pipeline.
    2. Add a Web activity and name it "PauseDW".
    3. Set the method to "POST".
    4. In the advanced section, set the authentication to "MSI" and the resource to "https://management.core.windows.net".
    5. In the IAM settings for your Synapse workspace, add the Contributor role for your data factory.
    6. Debug your pipeline and your pool should now pause. Update the API call to replace the word "pause" with "resume" to have it work the other way around. (A sketch of the underlying REST call is shown after this list.)
    7. Add these pipelines to triggers that run at specific times of day to turn your resources on and off.
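
For reference, the Web activity is calling the Azure management REST API for dedicated SQL pools. Here is a rough Python sketch of the same pause/resume call; the names are placeholders, the api-version may need updating for your environment, and it assumes the azure-identity and requests packages are installed.

import requests
from azure.identity import DefaultAzureCredential

# Placeholders: substitute your subscription, resource group, workspace, and pool names
SUB = "<subscription-id>"
RG = "<resource-group>"
WORKSPACE = "<synapse-workspace>"
POOL = "<dedicated-sql-pool>"
ACTION = "pause"  # or "resume"

url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.Synapse/workspaces/{WORKSPACE}/sqlPools/{POOL}"
    f"/{ACTION}?api-version=2021-06-01"
)

# DefaultAzureCredential picks up az CLI / managed identity credentials automatically
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
print(response.status_code, response.text)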

    Automatically pausing and resuming Azure Workspace Synapse Pool Using Azure Data Factory.

    Looping SQL Tables to Data Lake in Azure Data Factory https://jackofalltradesmasterofsome.com/blog/2021/02/08/looping-sql-tables-to-data-lake-in-azure-data-factory/ Mon, 08 Feb 2021 02:31:46 +0000

    When you load data from a SQL Server, it is best to have one dynamic, table-driven process instead of individual pipelines per table. Learn how to loop through SQL tables dynamically to load them from SQL Server to Azure Data Lake. Looping SQL Tables to Data Lake in Azure Data Factory

    Setting up the ETLControl Database

    Create the ETLControl database and the table that stores the metadata for the ETL runs.

    USE [ETLControl]
    GO
    
    SET ANSI_NULLS ON
    GO
    
    SET QUOTED_IDENTIFIER ON
    GO
    
    CREATE TABLE [dbo].[ETLControl](
    	[Id] [int] IDENTITY(1,1) NOT NULL,
    	[DatabaseName] [varchar](50) NOT NULL,
    	[SchemaName] [varchar](50) NOT NULL,
    	[TableName] [varchar](50) NOT NULL,
    	[LoadType] [varchar](50) NOT NULL
    ) ON [PRIMARY]
    GO
    
    
    Insert Into [dbo].[ETLControl] (DatabaseName, SchemaName, TableName, LoadType)
    Select 'DatabaseName', 'dbo', 'TableName1', 'Full'
    
    Insert Into [dbo].[ETLControl] (DatabaseName, SchemaName, TableName, LoadType)
    Select 'DatabaseName', 'dbo', 'TableName2', 'Full'
    

    Setting up Azure Data Factory

    1. Create a Linked Service to the SQL database.
    2. Create a dataset for the ETLControl database.
      • Point it to the Linked Service for SQL Server.
      • Do not assign it a table name. This will be done dynamically later.
    3. Add a new pipeline with a Lookup activity.
      • Set the source query to "Select * From ETLControl".
    4. Add a ForEach loop.
      • In the settings, add the dynamic item "@activity('Get-Tables').output.value" (Get-Tables being the name of the Lookup activity).
    5. Add a new dataset for the SQL source.
      • Give it parameters for TableName and SchemaName.
      • Update the table name to use those parameters.
    6. Add a new dataset for the Data Lake destination.
      • Give it a parameter for FileName.
      • Update the file path to use the dynamic content parameter.
    7. Add a Copy activity inside the ForEach loop.
      • Set the source using the values from the Lookup (for example @item().SchemaName and @item().TableName).
      • Set the sink file name from the Lookup value, with a .csv extension.
    8. Debug the run to see new files land in the Data Lake with dynamic names. There should be one file for each table that was loaded. You can modify the file names to include folder names and more dynamic storage paths if needed. To smoke-test the control table and loop logic outside of Data Factory, see the sketch after this list.
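
As a quick way to validate the ETLControl metadata and the loop pattern outside of Data Factory, here is a rough Python sketch that reads the control table and exports each listed table to a CSV. The connection string is a placeholder and assumes the Microsoft ODBC Driver 18 for SQL Server plus the pandas and pyodbc packages.

import pyodbc
import pandas as pd

# Placeholder connection to the server that hosts ETLControl and the source tables
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};Server=localhost;Database=ETLControl;"
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)

# Same metadata the ADF Lookup activity reads
control = pd.read_sql("SELECT * FROM dbo.ETLControl", conn)

# One extract per control row, mirroring the ForEach + Copy activity
for row in control.itertuples():
    query = f"SELECT * FROM [{row.DatabaseName}].[{row.SchemaName}].[{row.TableName}]"
    pd.read_sql(query, conn).to_csv(f"{row.TableName}.csv", index=False)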

    Looping SQL Tables to Data Lake in Azure Data Factory

    Streaming ETL with Azure Data Factory and CDC – Create a Parameter Driven Pipeline https://jackofalltradesmasterofsome.com/blog/2021/01/27/streaming-etl-with-azure-data-factory-and-create-a-parameter-driver-pipeline/ Wed, 27 Jan 2021 01:27:40 +0000

    In this series we look at building a streaming ETL with Azure Data Factory and CDC – this post covers creating a parameter-driven pipeline. This is Part 6; the rest of the series is below.

    1. Enabling CDC
    2. Setting up Audit Tables
    3. Provisioning Azure Data Factory
    4. Provisioning Azure Blog Storage
    5. Create Data Source Connection in ADF
    6. Create Incremental Pipeline in ADF
    7. Create a Parameter Driven Pipeline
    8. Create a Rolling Trigger

    This series uses the AdventureWorks database. For more information on how to get that set up, see my YouTube video on downloading and restoring the database.

    The previous step will pull all the changes in the CDC table, but we do not want to do this all the time. So let’s look at creating a rolling window for the CDC ETL.

    1. Navigate to the Parameters section of the pipeline and add two parameters, "triggerStartTime" and "triggerEndTime", and set them to yesterday's and today's dates in the format "2020-01-07 12:00:00:000".
    2. On the Lookup activity, update the query in the settings to the following to use the new parameters. SQL Server Agent must be running for this step, and the parameters must be valid dates.

    @concat('DECLARE @begin_time datetime, @end_time datetime, @from_lsn binary(10), @to_lsn binary(10);
    SET @begin_time = ''',pipeline().parameters.triggerStartTime,''';
    SET @end_time = ''',pipeline().parameters.triggerEndTime,''';
    SET @from_lsn = sys.fn_cdc_map_time_to_lsn(''smallest greater than or equal'', @begin_time);
    SET @to_lsn = sys.fn_cdc_map_time_to_lsn(''largest less than'', @end_time);
    SELECT count(1) changecount FROM cdc.fn_cdc_get_all_changes_dbo_DimProduct (@from_lsn, @to_lsn, ''all'')')

    3. Navigate back to the "True" condition and paste the following query in to track the changes, again using the parameters.

    @concat('DECLARE @begin_time datetime, @end_time datetime, @from_lsn binary(10), @to_lsn binary(10);
    SET @begin_time = ''',pipeline().parameters.triggerStartTime,''';
    SET @end_time = ''',pipeline().parameters.triggerEndTime,''';
    SET @from_lsn = sys.fn_cdc_map_time_to_lsn(''smallest greater than or equal'', @begin_time);
    SET @to_lsn = sys.fn_cdc_map_time_to_lsn(''largest less than'', @end_time);
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_DimProduct(@from_lsn, @to_lsn, ''all'')')

    4. Edit the sink dataset in the True condition and click on Parameters.
    5. Add a new parameter called triggerStart.
    6. Head back to the Connections tab for the dataset, where we will add dynamic content for the directory and file.
    7. Add the following for the directory and file sections.

    Directory

    @concat('dimProduct/incremental/',formatDateTime(dataset().triggerStart,'yyyy/MM/dd'))

    File

    @concat(formatDateTime(dataset().triggerStart,'yyyyMMddHHmmssfff'),'.csv')

    8. Navigate back to the Sink tab in the Copy activity and expand the dataset properties. Add the dynamic content for the new parameter (for example, @pipeline().parameters.triggerStartTime).
    9. You can now trigger your run and see the new files landing in the data lake. A small helper for generating the parameter values is sketched below.
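
When triggering the run manually or from a script, the two parameters need valid dates in that "2020-01-07 12:00:00:000" style. Here is a small Python sketch that builds a yesterday-to-today window in that format (the window choice is an assumption; adjust it to your schedule):

from datetime import datetime, timedelta

# Build triggerStartTime/triggerEndTime strings in the pipeline's expected format,
# e.g. "2020-01-07 12:00:00:000" (note the colon before the milliseconds)
def cdc_window_value(dt: datetime) -> str:
    return dt.strftime("%Y-%m-%d %H:%M:%S") + ":000"

now = datetime.utcnow()
trigger_start = cdc_window_value(now - timedelta(days=1))  # yesterday
trigger_end = cdc_window_value(now)                        # today

print(trigger_start, trigger_end)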

    Streaming ETL with Azure Data Factory and CDC – Create a Parameter Driven Pipeline
