Many System Administrators and DevOps engineers start with this simple approach.
As the sole user of version control, they feel satisfied with just having their code versioned.
The problem arises the day another team member joins or (what’s more likely) you join a larger team. Larger teams require more structure and coordination to work effectively with version control, and they will most likely expect this from you before you get to work.
When it comes to GitHub, you could argue that the Pull Request is GitHub’s greatest feature (maybe GitHub Actions are the second greatest feature hehe).
Let’s look at a simple workflow that utilizes this feature.
It’s designed to promote collaboration and has features that promote DevOps:
This is crucial: you need to make sure your local copy, your fork, and the original repository are all on the same page.
Always make sure your fork is up to date before starting new work. You’ll minimize merge conflicts this way. This should be something you think about every time you open your IDE.
git checkout main
git pull origin main     # Get changes from your fork
git pull upstream main   # Get changes from original repo
The idea is a small daily exercise that helps you get used to the workflow and builds memory through repetition.
In this case, the workflow I’ve described above can be used as a daily practice (a command sketch follows the list):
Create a new branch
Make a small change
Create a PR
Get it reviewed and merged, or close it yourself if it isn’t needed and was only for learning purposes
Clean up
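Here’s what one small rep could look like on the command line (the branch and file names are just examples):
git checkout main
git pull origin main                      # start from an up-to-date main
git checkout -b practice/small-change     # create a new branch
echo "practice" >> notes.md               # make a small change
git add notes.md
git commit -m "practice: small change"
git push -u origin practice/small-change
# open a PR for the branch on GitHub, get it reviewed and merged (or close it)
git checkout main
git pull origin main
git branch -d practice/small-change       # use -D if the branch was never merged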
This is a safe, simple and powerful way to get used to the workflow and build memory through repetition, which is one of my favorite ways to learn something new!
GitHub CLI is a command line tool that allows you to interact with GitHub from the terminal.
Learning the workflow above, combining it with the GitHub CLI, and utilizing auto-merge can automate much of this. It won’t be as simple as pushing directly to main, but it’s a good compromise, and sooner or later you’ll have to learn it.
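With the GitHub CLI that can look something like this (note that auto-merge has to be enabled in the repository settings for the last command to take effect):
# create the PR from the current branch, filling title and body from the commits
gh pr create --fill
# let GitHub merge it automatically once reviews and checks pass
gh pr merge --auto --squash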
I’m using sway on arch (btw) and naturally I need a terminal GUI app launcher. I was quite surprised how well this works; the prerequisites are fzf and flatpak:
flatpak run (flatpak list --app --columns=application | fzf) > /dev/null 2>&1 &
As mentioned, I version control my custom shell functions.
You can version control your ~/.config/fish/functions folder (for fish) using Git and GitHub. It’s possible with bash and zsh too (your ~/.bashrc, for example), but you have to set it up yourself if you want the same workflow, that is.
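For example, turning the fish functions folder into its own repository could look like this (the repository name is just an example):
cd ~/.config/fish/functions
git init
git add .
git commit -m "Initial commit of my fish functions"
gh repo create fish-functions --private --source=. --remote=origin --push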
When I use bash I usually version control my ~/.bashrc as a GitHub gist, but I’m not a fan of it, because it’s not as modular as self-contained functions in a repo.
That said, a gist is great because you can use the GitHub CLI to create or edit a gist right from the terminal!
# create a gist from a file (for example your bashrc)
gh gist create ~/.bashrc --desc "my bashrc"
# list gists
gh gist list
# edit a gist
gh gist edit <gist-id> --add ~/.config/fish/functions/mycoolfunc.fish
Users submit requests through ServiceNow for account access
The Catalog Item (ServiceNow request form) is clunky and tricky to use
Approvals may take days
Once approved, the DevOps pipeline is triggered, but because it’s hard to craft sensible API calls between an ITSM system and a DevOps tool, the data is harder to sanitize, making the run more error-prone
The user is presented with 200 lines of different logs after days of waiting
ITSM tools and DevOps tools solve different problems. ServiceNow is good at managing access and compliance workflows. GitHub is industry leading at technical collaboration and automation.
This approach uses both tools for what they do well. The API integration keeps ServiceNow updated while letting technical teams work efficiently.
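As a rough, hypothetical sketch of what such an integration could look like from the pipeline side, here’s a call to ServiceNow’s Table API that writes a work note back to the request item (the instance name, credentials, table and field values are all placeholders):
# Update the ServiceNow request item from a pipeline step (PowerShell 7+)
$instance = 'yourinstance'
$sysId    = $env:RITM_SYS_ID
$uri      = "https://$instance.service-now.com/api/now/table/sc_req_item/$sysId"

$secret = ConvertTo-SecureString $env:SNOW_PASS -AsPlainText -Force
$cred   = [pscredential]::new($env:SNOW_USER, $secret)

$body = @{ work_notes = 'Access provisioned by the GitHub workflow.' } | ConvertTo-Json

Invoke-RestMethod -Method Patch -Uri $uri -Authentication Basic -Credential $cred -ContentType 'application/json' -Body $body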
Have fun, now you only have about 25,000 lines of code to write!
The GitHub CLI (gh) makes working with forks and clones even easier:
# Fork and clone in one command
gh repo fork user/repo --clone=true
# Just fork (no clone)
gh repo fork user/repo
# Clone your existing fork
gh repo clone YOUR-USERNAME/repo
# Create PR from your fork
gh pr create --base main --head YOUR-USERNAME:feature-branch
Pro tip: gh repo fork automatically sets up the upstream remote for you, saving the manual git remote add upstream step.
I wrote a simple script to demonstrate how a DNS zone restore can be achieved using the Restore-ADObject cmdlet:
Importing Required Modules: Loads ActiveDirectory and DnsServer modules.
Setting Parameters: Allows specifying a DNS zone name, defaulting to “ehmiizblog”.
Searching for Deleted Zone: Looks for the deleted DNS zone in known AD locations.
Retrieving Deleted Records: Fetches resource records for the deleted zone.
Restoring Zone & Records: Restores the DNS zone and its records to their original names.
Restarting DNS Service: Restarts the DNS service to apply changes.
Output Messages: Provides feedback on the restoration progress and completion.
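A rough sketch of that approach (not the complete script) could look like this, assuming an AD-integrated zone stored in the DomainDnsZones partition and that the AD Recycle Bin is enabled so the deleted objects are still fully restorable:
param([string]$ZoneName = 'ehmiizblog')

Import-Module ActiveDirectory
Import-Module DnsServer

$domainDN     = (Get-ADDomain).DistinguishedName
$dnsPartition = "DC=DomainDnsZones,$domainDN"

# Find the deleted zone object among the partition's deleted objects
$deletedZone = Get-ADObject -Filter 'isDeleted -eq $true' -IncludeDeletedObjects `
    -SearchBase $dnsPartition -Properties 'msDS-LastKnownRDN','lastKnownParent' |
    Where-Object { $_.'msDS-LastKnownRDN' -like "*$ZoneName*" } |
    Select-Object -First 1

if (-not $deletedZone) { throw "No deleted zone matching '$ZoneName' was found." }

# Restore the zone container first, then the records that used to live under it
Restore-ADObject -Identity $deletedZone.ObjectGUID -Verbose

$restoredZoneDN = "DC=$ZoneName,$($deletedZone.lastKnownParent)"
Get-ADObject -Filter 'isDeleted -eq $true' -IncludeDeletedObjects `
    -SearchBase $dnsPartition -Properties 'lastKnownParent' |
    Where-Object { $_.'lastKnownParent' -like "*DC=$ZoneName*" } |
    Restore-ADObject -TargetPath $restoredZoneDN -Verbose

# Restart the DNS service so the restored zone is loaded
Restart-Service -Name DNS
Write-Output "Restore of '$ZoneName' attempted, verify with: Get-DnsServerZone -Name $ZoneName"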
Disclaimer: Please note that while these steps have been provided to assist you, I cannot guarantee that they will work flawlessly in every scenario. Always proceed with caution and make sure to back up your data before making any significant changes to your system.
Written on 2024-05-08 (Note: Information may become outdated soon, and this was just my approach)
If you’re a proud owner of the 2024 Asus ROG Zephyrus G16 (GU605MI) and running Fedora 40+, ensuring smooth functionality of essential features like sound, keyboard, screen brightness, and asusctl might require a bit (hehe) of tweaking.
Here’s a comprehensive guide, or really the steps I took, to get everything up and running.
The screen’s brightness works out of the box while on the dGPU.
However, that comes with certain drawbacks, like flickering Electron applications and increased power consumption. The steps below get the screen brightness controls to work in “Hybrid” and “Integrated” mode (while the display is being run by the iGPU).
Open the grub configuration file:
sudo nano /etc/default/grub
Add the following string at the end of the GRUB_CMDLINE_LINUX= line:
With these steps, I was able to get a somewhat functional GU605MI Fedora system. If you encounter any issues, refer to the respective documentation or seek further assistance from the Asus-Linux community.
Download the latest WinSW-x64.exe to the working directory
# Get the latest WinSW 64-bit executable browser download url
$ExecutableName = 'WinSW-x64.exe'
$LatestURL = Invoke-RestMethod 'https://api.github.com/repos/winsw/winsw/releases/latest'
$LatestDownloadURL = ($LatestURL.assets | Where-Object { $_.Name -eq $ExecutableName }).browser_download_url
$FinalPath = "$($WorkingDirectory.FullName)\$ExecutableName"

# Download it to the newly created working directory
Invoke-WebRequest -Uri $LatestDownloadURL -OutFile $FinalPath -Verbose
Create the PowerShell script which the service runs
This loop checks for notepad every 5 sec and kills it if it finds it
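A minimal sketch of such a script could look like this (the file name matches the one referenced in the service configuration below; the log path is just an example):
# Invoke-PowerShellServiceScript.ps1
while ($true) {
    $proc = Get-Process -Name notepad -ErrorAction SilentlyContinue
    if ($proc) {
        $proc | Stop-Process -Force
        Add-Content -Path 'C:\Path\To\Script\service.log' -Value "$(Get-Date -Format s) stopped notepad"
    }
    Start-Sleep -Seconds 5
}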
Just edit the id, name, description and startarguments
<service>
  <id>PowerShellService</id>
  <name>PowerShellService</name>
  <description>This service runs a custom PowerShell script.</description>
  <executable>C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe</executable>
  <startarguments>-NoLogo -file C:\Path\To\Script\Invoke-PowerShellServiceScript.ps1</startarguments>
  <log mode="roll"></log>
</service>
Save the .xml, in this example I saved it as PowerShell_Service.xml
# if not already, step into the working directory
cd $WorkingDirectory.FullName

# Install the service
.\WinSW-x64.exe install .\PowerShell_Service.xml

# Make sure powershell.exe's executionpolicy is Bypass
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope LocalMachine

# As an administrator
Get-Service PowerShellService | Start-Service
Running a PowerShell script as a service on any Windows machine isn’t that complicated thanks to WinSW. It’s a great choice if you don’t want to get deeper into the process of developing Windows services (it’s kind of a fun rabbit hole though).
Meaning the execution policy must support that use case (Bypass at the LocalMachine scope will do)
The script in this example is just a demo of a loop, but anything you can think of that loops will do here
Starting the Service requires elevated rights in this example
If you get the notorious The service did not respond to the start or control request in a timely fashion, you have my condolences (this is a very general error message with no single clear cause, it seems)
Git is a powerful and popular version control system, sometimes a bit too powerful.
Depending on how your day went, you may want to restore a file from git to a previous state, either because you made an oopsie, want to undo some changes, or need to compare different versions.
Let’s go through four common scenarios on how to do just that!
The simplest scenario is when you have saved your file locally on your local git repository, but have not staged or committed it yet.
In this case, you can use the git restore command to discard the changes in your working directory and restore the file to the last committed state.
For example, if you want to restore a file named index.html, you can run the following command:
git restore index.html
This will overwrite the index.html file in your working directory with the version from the HEAD commit, which is the latest commit on your current branch.
You can also use a dot (.) instead of the file name to restore all the files in your working directory.
The next scenario is when you have saved your file locally and staged it locally, but have not committed it yet.
In this case, you can use the git restore --staged command to unstage the file and remove it from the staging area.
For example, if you want to unstage a file named index.html, you can run the following command:
git restore --staged index.html
This will remove the index.html file from the staging area and leave it in your working directory with the changes intact. You can then use the git restore command as in the previous scenario to discard the changes in your working directory and restore the file to the last committed state. Alternatively, you can use this command:
git restore --source=HEAD --staged --worktree <file>
To unstage and restore the file in one step.
For example, if you want to unstage and restore a file named index.html, you can run the following command:
git restore --source=HEAD --staged --worktree index.html
This will remove the index.html file from the staging area and overwrite it in your working directory with the version from the HEAD commit. You can also use a dot (.) instead of the file name to unstage and restore all the files in your staging area and working directory.
The third scenario is when you have saved your file locally, staged it locally and committed it, but have not pushed it to the remote repository yet. In this case, you can use the git reset --hard command to reset your local branch to the previous commit and discard all the changes in your staging area and working directory. For example, if you want to reset your local branch to the previous commit, you can run the following command:
git reset --hard HEAD~1
This will reset your local branch to the commit before the HEAD commit, which is the latest commit on your current branch.
This will also discard all the changes in your staging area and working directory, including the file you want to restore.
You can then use the git checkout command to check out the file from the previous commit and restore it to your working directory.
For example, if you want to check out and restore a file named index.html from the previous commit, you can run the following command:
git checkout HEAD~1 index.html
This will check out the index.html file from the commit before the HEAD commit and overwrite it in your working directory with the version from that commit.
You can also use a dot (.) here as well, to check out and restore all the files from the previous commit.
The fourth and final scenario is when you have saved your file locally, staged it locally, committed it and pushed it to the remote repository.
In this case, you can use the git revert command to create a new commit that reverses the changes in the previous commit and restores the file to the state before that commit.
For example, if you want to revert the previous commit and restore a file named index.html to the state before that commit, you can run the following command:
git revert HEAD
This will create a new commit that reverses the changes in the HEAD commit, which is the latest commit on your current branch.
This will also restore the index.html file in your working directory and staging area to the version from the commit before the HEAD commit.
You can then push the new commit to the remote repository to update it with the reverted changes.
You can also use the --no-commit option to revert the changes without creating a new commit, and then use the git restore or git checkout commands as in the previous scenarios to restore the file to the desired state.
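For example:
# revert the latest commit but leave the result staged instead of committing it
git revert --no-commit HEAD
# inspect what the revert staged, adjust as needed, then commit when you're happy
git status
git commit -m "Revert index.html to its previous state"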
We’ve demonstrated how to restore a file from git in four different scenarios, depending on how far you have progressed in the git workflow.
We have used the git restore, git reset, git checkout and git revert commands to discard, unstage, check out and revert changes in your files and restore them to the previous states.
I hope this post has been helpful and maybe even saved some headache!
If you have any questions or feedback, please feel free to DM me on Twitter or LinkedIn.
Have you ever asked yourself, what module imports the Install-Module cmdlet?
It’s kind of a meta question, check for yourself! Spoiler a bit down for anyone reading on mobile.
Get-Command -Name Install-Module
CommandType     Name                 Version    Source
-----------     ----                 -------    ------
Function        Install-Module       2.2.5      PowerShellGet
I installed PowerShell 7.4 on two different Ubuntu 20.04 WSL distros, and I installed a few modules to benchmark the old trusty Install-Module and the new sheriff in town: Install-PSResource.
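A simple way to reproduce that kind of comparison is to time both cmdlets on a machine where the module isn’t installed yet (the module name is just an example):
# PowerShellGet V2
Measure-Command { Install-Module -Name PSScriptAnalyzer -Force }
# PSResourceGet
Measure-Command { Install-PSResource -Name PSScriptAnalyzer -TrustRepository }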
The results speak for themselves. PSResourceGet is much faster than PowerShellGet V2.
Speaking of PowerShellGet V2, there’s still a future for this module, but instead of new APIs and features, V3 (currently in pre-release) has been converted into a compatibility layer over the new and faster PSResourceGet.
The parameters of the new PSResourceGet are not supported when calling the older cmdlets, and there’s no official documentation out for PowerShellGet V3 yet, so to me this seems aimed purely at pipeline scenarios where existing code can simply pick up the new functionality. It has less to do with interactive use, it seems. Here’s some further reading on the subject.
PSResourceGet seems to me like an awesome new module based on its speed, so I’d better get used to its new syntax, because this will be my new main driver for sure.
Get-Command -Module Microsoft.PowerShell.PSResourceGet | Sort-Object Name
It’s not only installing modules that’s faster, it’s also very fast at getting installed modules.
Getting installed modules can be very time-consuming on shared systems, especially where you have the az modules installed, so this is a great performance win overall.
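You can see the difference by timing both on your own system (a rough comparison, not a rigorous benchmark):
Measure-Command { Get-InstalledModule }        # PowerShellGet V2
Measure-Command { Get-InstalledPSResource }    # PSResourceGet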
Finding new modules and scripts is also a crucial part of PowerShell, especially for the community members. I would argue that with PSResourceGet going GA, PowerShell 7.4 is probably one of the most significant performance boosters of PowerShell (in its open-source life).
As you can see, finding modules is way faster, and here we’re even using two-way wildcards.
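A quick way to try the same kind of two-way wildcard search yourself (the pattern is just an example):
Measure-Command { Find-Module -Name '*linux*' }
Measure-Command { Find-PSResource -Name '*linux*' }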
Let’s try to use the new Publish-PSResource. I have a minor bug-fix to do on my project linuxinfo and will edit my publishing script so that GitHub Actions will publish it for me using Publish-PSResource.
I start with editing my very simple publishing script. Since I don’t know if the GitHub-hosted runner will have PSResourceGet installed yet, I need to validate that the cmdlet is present before calling it. If it’s not, I simply install it using PowerShellGet v2.
This should do it!
Hmm, seems like I messed something up. The GitHub-hosted runner can’t find Publish-PSResource, so it’s trying to install PSResourceGet using Install-Module. However, I misspelled the module name if you look closely at line 7. It should be Microsoft.PowerShell.PSResourceGet; let’s fix that and re-run my workflow.
Looks way better now!
And there’s a new version of linuxinfo with a minor bugfix. And the Publish-PSResource migration was very straightforward.
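In the end, the relevant part of the publishing script boils down to something like this (a sketch; the path and secret name are placeholders):
# Publish-PSResource ships with Microsoft.PowerShell.PSResourceGet
if (-not (Get-Command -Name 'Publish-PSResource' -ErrorAction SilentlyContinue)) {
    # the runner doesn't have PSResourceGet yet, install it with PowerShellGet v2
    Install-Module -Name Microsoft.PowerShell.PSResourceGet -Force
}
Publish-PSResource -Path ./linuxinfo -ApiKey $env:PSGALLERY_API_KEY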
In this post, we learned about the origin of Install-Module, being PowerShellGet v2, and its successor Install-PSResource, from the new PSResourceGet module. We took some cmdlets for a spin and realized that the new version is easily twice as fast, in some cases even three times faster.
We covered PowerShellGet V3 being a compatibility layer and some caveats with it.
We looked at migrating a simple publishing script from Publish-Module to Publish-PSResource.
I recommend poking around with the new PSResourceGet cmdlets and reading its official documentation, and for interactive use, not relying on any compatibility layer; save that for the edge cases.
Thanks for reading this far, hope you found it helpful. PM me on twitter for any feedback.