Snippset

Snippset Feed


If you're upgrading your home internet, Wi-Fi 7 mesh systems promise blazing speeds, lower latency, and better performance in busy, device-packed households. This video breaks down the top 5 options available now, balancing speed, coverage, and value.

Highlights:

  • Budget Pick: TP-Link Deco BE63 — Affordable and fast enough for 4K/8K streaming and gaming. Ideal for smaller homes.
  • Power Users: ASUS ZenWiFi BQ16 Pro — Advanced features, strong security, and deep customization for tech-savvy users.
  • Smart Home Friendly: Amazon Eero Max 7 — Great for Alexa-based homes with built-in Matter/Thread support and easy voice control.
  • Best Value: TP-Link Deco BE85 — High-performance, quad-band Wi-Fi 7 with simple setup and strong security.
  • Top Performer: Netgear Orbi RBE973 — Premium build, ultra-fast speeds (up to 27 Gbps), and elite performance for large homes.

Whether you're streaming, gaming, or building a smart home, there’s a system on this list that fits your needs and budget.

Watch the full breakdown on YouTube

Windows by Burton

If the built-in Stereo Mix method is unavailable or insufficient:

  • VoiceMeeter: A third-party application that provides advanced audio routing capabilities, allowing for complex configurations of multiple audio outputs.
  • Audio Splitters: Physical devices that can duplicate audio signals to multiple outputs without software configuration.

These alternatives offer additional flexibility for managing multiple audio outputs.


Wi-Fi 7 mesh systems are the next big thing in home networking, offering ultra-fast speeds, better range, and improved connectivity for demanding households. This comparison breaks down five top contenders: TP-Link Deco BE85 & BE95, ASUS BQ16 Pro, Eero Max 7, and Netgear Orbi 970.

  • Best Overall for Wireless Backhaul: TP-Link Deco BE95 delivered the fastest speeds without cables, ideal for homes where wiring is limited.
  • Most Advanced Configuration: ASUS BQ16 Pro shines with a rich feature set, great performance, and no extra subscriptions — perfect for tech enthusiasts.
  • Fastest Wi-Fi Speeds: Eero Max 7 surprised with top wireless speeds, especially in close-range tests.
  • Best Long-Range Performance: Netgear Orbi 970 stood out in 100-foot range tests, maintaining strong and stable performance.
  • Value Pick: TP-Link BE85 offers nearly identical features to the BE95 at a lower cost.

Each system supports speeds of up to 10 Gbps and offers a mobile app for setup. However, ASUS stands out with advanced settings and no added fees for features like parental controls or security.

Watch the full video comparison on YouTube

.NET by Jerry

When building background services in .NET, it’s helpful to include structured logging for better traceability and diagnostics. One common pattern is using logging scopes to include context like the service or task name in each log entry.

Instead of manually providing this context everywhere, you can simplify the process by automatically extracting it based on the class and namespace — making the code cleaner and easier to maintain.


✅ Goal

Replace this verbose pattern:

_logger.BeginScope(LoggingScopeHelper.CreateScope("FeatureName", "WorkerName"));

With a simple, reusable version:

_logger.BeginWorkerScope();

Implementation

🔧 1. Logger Extension

using System;
using Microsoft.Extensions.Logging;

public static class LoggerExtensions
{
    public static IDisposable BeginWorkerScope(this ILogger logger)
    {
        var scopeData = LoggingScopeHelper.CreateScope();
        return logger.BeginScope(scopeData);
    }
}

🧠 2. Logging Scope Helper

using System;
using System.Collections.Generic;
using System.Diagnostics;

public static class LoggingScopeHelper
{
    public static Dictionary<string, object> CreateScope()
    {
        var stackTrace = new StackTrace();
        var frames = stackTrace.GetFrames();

        string featureName = "Unknown";
        string workerName = "Unknown";

        foreach (var frame in frames ?? Array.Empty<StackFrame>())
        {
            var method = frame.GetMethod();
            var type = method?.DeclaringType;

            // Async methods run inside compiler-generated state machines ("<Name>d__N");
            // unwrap them to the containing user type
            if (type?.Name.StartsWith("<") == true)
                type = type.DeclaringType;

            // Skip framework frames and the logging helper classes themselves
            if (type == null
                || type.Namespace?.StartsWith("System") == true
                || type.Namespace?.StartsWith("Microsoft") == true
                || type == typeof(LoggingScopeHelper)
                || type == typeof(LoggerExtensions))
                continue;

            workerName = type.Name;

            var ns = type.Namespace;
            var segments = ns?.Split('.');
            if (segments?.Length >= 3)
            {
                featureName = segments[2]; // Assuming format: Company.App.Feature.Workers
            }

            break;
        }

        return new Dictionary<string, object>
        {
            ["Feature"] = featureName,
            ["Worker"] = workerName
        };
    }
}

Example Usage in a Worker

protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
    using (_logger.BeginWorkerScope())
    {
        // Your background task logic
    }
}

This ensures that every log written within the scope will automatically include "Feature" and "Worker" values, without manually specifying them.
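Scopes only appear in log output if the logging provider is configured to include them. As a minimal sketch (assuming the standard console provider from Microsoft.Extensions.Logging), IncludeScopes can be enabled in appsettings.json:

```json
{
  "Logging": {
    "Console": {
      "IncludeScopes": true
    }
  }
}
```

With this enabled, console entries written inside BeginWorkerScope() will carry the Feature and Worker values.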


Sometimes configuration files or scripts include identifiers that need to be updated automatically — for example, replacing a generic keyword like "rule-template" with a dynamic name based on a service or environment.

This Snipp shows how to:

  • Replace an exact identifier in a file
  • Normalize that name (e.g. replacing special characters like dots)

Goal

Replace this:

rule-template

With something like:

rule-example-service

Where "example.service" is the dynamic input.

PowerShell Example

# Define the original dynamic name
$name = "example.service"

# Normalize the name (e.g., replace '.' with '-')
$normalizedName = $name -replace '\.', '-'

# Read the text file content
$content = Get-Content -Path "file.txt" -Raw

# Replace the exact identifier
$content = $content -replace 'rule-template', "rule-$normalizedName"

# Save the updated content
Set-Content -Path "file.txt" -Value $content

This ensures that:

  • Every literal occurrence of "rule-template" is replaced (the pattern contains no regex metacharacters, so no escaping is needed)
  • Any special characters in the dynamic name are safely converted
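Putting it together, with $name = "example.service" the script rewrites the file roughly like this (the surrounding file content is illustrative; only the identifier comes from the example above):

```text
# file.txt before
rule-template

# file.txt after
rule-example-service
```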

Summary

This method is useful when working with reusable config files across different services or environments. PowerShell makes it easy to normalize and apply names consistently, reducing manual edits and potential mistakes.

DevOps by Patrik

When managing configuration files (like YAML for CI/CD), you may need to replace placeholder values with actual items — for example, when working with multiple components or environments.

This guide shows how to automate two common tasks using PowerShell:

  1. Replacing a placeholder line with multiple entries.

  2. Dynamically inserting names where specific patterns are used.

Task 1: Replace a Placeholder Line with a List

Suppose your YAML file contains this line:

  - "{item}/**/*"

You can replace it with multiple entries, like:

  - "ServiceA/**/*"
  - "ServiceB/**/*"

PowerShell Example:

# Define the list of items
$itemList = "ServiceA,ServiceB" -split ',' | ForEach-Object { $_.Trim() }

# Read the YAML content
$content = Get-Content -Path "input.yml" -Raw

# Build the replacement block with correct indentation
$replacement = ($itemList | ForEach-Object { '  - "' + $_ + '/**/*"' }) -join "`n"

# Replace only the exact placeholder line (the indentation in the pattern
# must match the file; two spaces here, as in the snippet above)
$content = $content -replace '(?m)^  - "\{item\}/\*\*/\*"', $replacement

# Write the updated content
Set-Content -Path "output.yml" -Value $content
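To make the effect concrete, here is the placeholder block before and after running the script (the paths: key is illustrative; only the entries come from the example above):

```yaml
# input.yml (before)
paths:
  - "{item}/**/*"

# output.yml (after)
paths:
  - "ServiceA/**/*"
  - "ServiceB/**/*"
```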

Using PowerShell to manage placeholders in configuration files helps streamline setup for dynamic environments or multiple components. These techniques are useful for automating CI/CD pipelines, especially when working with reusable templates and environment-specific configurations.

Learning by Patrik

Most people fail at learning new skills not because they aren’t trying hard enough, but because they fall into a trap called "Theory Overload." This happens when we try to learn too much at once—cramming in ideas without giving ourselves time to build habits through practice.

The Real Key to Learning: Experiential Cycling

To truly learn, we need to go through a cycle:

  1. Try something (practice)
  2. Observe the result
  3. Reflect on what to improve
  4. Try again with adjustments

Without this loop, progress stalls—just like shooting arrows without adjusting your aim.

Balance Theory with Practice

Learning is mentally demanding. Our brains have limited capacity, especially when new skills aren't yet habits. Trying to juggle too many techniques at once leads to cognitive overload, where nothing sticks.

To avoid this:

  • Only add new theory when older skills become automatic.
  • For every 1 hour of theory, aim for at least 5 hours of practice.
  • If a task starts to feel easier and faster without deliberate effort, you're forming a habit. That's the green light to introduce new ideas.

Takeaway

The fastest way to learn is often the slowest. Focus on forming habits, balancing input, and not rushing through content. Sustainable growth beats cramming every time.

Watch the full video by Justin Sung

DevOps by Patrik

When working with GitLab CI/CD, workflow:rules controls whether a pipeline is created at all. It's worth understanding how these rules work because they are evaluated before any job runs and use GitLab's CI/CD variable-expression syntax.

✅ What You Can Do in workflow:rules

  • Use basic comparisons like == or !=
  • Use logical conditions with && and ||
  • Use pattern matching with =~ and !~ (e.g., $CI_COMMIT_BRANCH =~ /^feature\//)
  • Use $CI_COMMIT_BRANCH, $CI_COMMIT_TAG, and similar predefined variables
  • Wrap the full condition in single quotes

Example:

workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH == "main" || $CI_COMMIT_BRANCH =~ /^feature\//'
      when: always
    - when: never

❌ What You Can’t Do in workflow:rules

  • No function calls such as startsWith(...); GitLab's variable expressions have no functions, so use an anchored regex with =~ instead
  • No complex YAML syntax like nested objects inside if
  • No unquoted expressions; quoting the full condition avoids YAML parsing surprises

💡 Pro Tip:

If you need to skip certain jobs based on the commit message (e.g., [nopublish]), job-level rules are the right place, since a matching workflow:rules condition would suppress the entire pipeline.

some_job:
  rules:
    - if: '$CI_COMMIT_MESSAGE =~ /\[nopublish\]/'
      when: never
    - when: always
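As a combined sketch, one .gitlab-ci.yml can gate the whole pipeline with workflow:rules while individual jobs keep their own logic (the job name and the [skip-build] flag are illustrative):

```yaml
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
      when: always
    - when: never

build_job:
  rules:
    - if: '$CI_COMMIT_MESSAGE =~ /\[skip-build\]/'
      when: never
    - when: always
  script:
    - echo "Building..."
```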

Conclusion

Use workflow:rules to decide whether a pipeline is created at all, based on branch, tag, or variable conditions. Keep per-job logic, such as skipping a single job on a commit-message flag, in job-level rules.

AI by Josh

If you're using ChatGPT and can't find a conversation you previously had, you might wonder whether it was archived and how to get it back. Here's a quick guide to understanding how ChatGPT handles old chats and how to find them again.

Where Are Archived Chats?

ChatGPT saves your conversations automatically unless you manually delete them, and it also offers an archive option: archiving a chat removes it from the sidebar without deleting it. Archived chats can be restored from Settings → General → Archived chats.

Here’s how to find an older chat:

  • On Desktop (chat.openai.com):

    1. Log in to your account.
    2. Use the search bar at the top of the left sidebar, or open Settings → General → Archived chats.
    3. Type keywords from your previous chat to bring it up.
  • On Mobile App:

    1. Open the ChatGPT app.
    2. Tap the menu (☰) or swipe from the left.
    3. Use the search function to look through older chats, or check the archived list in Settings.

Things to Keep in Mind

  • Archived chats are hidden from the sidebar but not deleted; they can be restored at any time.
  • If you deleted a chat, it’s gone permanently.
  • Make sure you're logged into the correct account, especially if you’ve used multiple sign-in methods (like Apple ID, Google, or email).

If a conversation seems to be missing, check the Archived chats list first, then use search. Deleted chats can’t be restored, but anything not deleted remains available.

DevOps by Patrik

When building CI pipelines in GitLab for multiple projects, you often need to pass a list of project names to a script. However, GitLab CI doesn’t support arrays as environment variables. The best solution is to pass the values as a comma-separated string and split them inside your PowerShell script. This method is clean, compatible, and easy to maintain.

Implementation

Step 1: Define the project list as a CSV string in .gitlab-ci.yml

variables:
  PROJECT_NAMES: "ProjectA,ProjectB,ProjectC"

Step 2: Pass it as a parameter to the PowerShell script

script:
  - powershell -ExecutionPolicy Bypass -File .\Create-Pipeline.ps1 -projectNamesRaw "$env:PROJECT_NAMES"

Step 3: Process the string inside the PowerShell script

param(
    [Parameter(Mandatory)]
    [string]$projectNamesRaw
)

# Split and trim project names into an array
$projectNames = $projectNamesRaw -split ',' | ForEach-Object { $_.Trim() }

foreach ($projectName in $projectNames) {
    Write-Output "Processing project: $projectName"
    # Your logic here
}

Why This Works

  • GitLab treats all variables as strings, so this approach avoids format issues
  • -split creates an array inside PowerShell
  • Trim() ensures clean names even with extra spaces
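Because PROJECT_NAMES is an ordinary string variable, it can also be overridden for a single job or pipeline run without changing the script. A sketch (the job name is illustrative):

```yaml
create-pipelines-subset:
  variables:
    PROJECT_NAMES: "ProjectA,ProjectC"  # overrides the global CSV list for this job only
  script:
    - powershell -ExecutionPolicy Bypass -File .\Create-Pipeline.ps1 -projectNamesRaw "$env:PROJECT_NAMES"
```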