Automating the removal of TDS Projects

As part of the recent work I’ve done on migrating from TDS to SCS I had some tidying up to do. I needed to remove the TDS projects from the solution and all the files and sub-folders, including the .item files.

We had around 66 TDS projects in the Solution, so this wasn’t something I wanted to do manually, as it would take a while and I’d likely miss something. I figured someone must have written a script to do this already, but I couldn’t find anything, so I created a PowerShell script myself.

Note 1: There is a dependency here on the dotnet cli. This was the most efficient and supported way of cleaning up the solution file and most developers will likely have this installed already.

Note 2: I set up SCS to create my serialized items in a folder called scs, so the legacy TDS items could stay in the serialization folder and keep the two separate. If you’ve not done this you may need to modify the script below, or re-configure your SCS folder, before using it.

If you save the script to your machine from the Gist below it will look something like this:

Show me the Script Already

When you open the script you need to set the variables for your solution path, solution name, and whether projects, files, folders and parent folders should be removed.
Most of these should be set to $true, but depending on your folder structure you might need to set $tidyUpParentFolders to $false:
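To give an idea of the mechanics, here is a heavily simplified sketch of the core loop (not the full Gist – the paths are placeholders and the real script also handles the parent-folder tidy-up and reporting):

# simplified sketch: remove each TDS project from the solution via the
# dotnet CLI, then delete its folder (including the serialized .item files)
$solutionPath = "C:\Projects\MySolution"   # placeholder
$solutionName = "MySolution.sln"           # placeholder
$sln = Join-Path $solutionPath $solutionName

# TDS projects use the .scproj extension
$tdsProjects = Get-ChildItem $solutionPath -Filter *.scproj -Recurse
foreach ($proj in $tdsProjects) {
    dotnet sln $sln remove $proj.FullName
    Remove-Item $proj.Directory.FullName -Recurse -Force
}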

After the script has run you will see an output showing what the script has removed, including project files, item files and folders:

Once you’ve run it, check the removed files look correct before checking in your changes to source control. Also ensure your solution still builds and there are no issues.

Hopefully this is useful for anyone else who makes the move from TDS to SCS.

For more on Migrating from TDS to SCS read: A Guide to Migrating from Sitecore TDS to SCS

A Guide to Migrating from Sitecore TDS to SCS

A client I’m currently working with recently upgraded to Sitecore 10.3 and I was tasked with migrating our TDS projects to the much newer Sitecore Content Serialization (SCS). I imagine a lot of people need to do this, so I thought I’d share the approach I took and the steps I followed.

Why migrate to SCS?

SCS is now the de facto standard for content serialization in Sitecore – not only for Sitecore XP but for XM Cloud too. It has matured to the point where it has all the features we need as Sitecore developers and is well established. It’s also built in a modern way, so it is easy to use and very fast.

If you are currently using TDS, be aware it is now legacy/end of life and will likely soon be unsupported. In addition, managing items via TDS is tedious and slow. With SCS we can just run the watch command and pick up changes from Sitecore without having to manually add items, which is a lot quicker. You also need a license for TDS, but you don’t for SCS (via the CLI at least).

I’ve always preferred Unicorn to TDS, but on the current project I’m working on TDS was already implemented. If you are using Unicorn, though, there are similar approaches for migrating from Unicorn to SCS too.

Differences between TDS project files and SCS Modules

TDS project files are proprietary XML-based files that contain a list of all the items to serialize; they are designed to be managed via the TDS Visual Studio plugin.

SCS module files, on the other hand, are JSON-based files which include a list of the item paths to sync. The items themselves are then serialized to disk as YAML files. The format is a little different, but essentially the definition is quite similar. There is a nice comparison in Jason St-Cyr’s blog post:

Approaches to Migration

There are two different approaches to migration to consider here.

Manual Migration

You could manually migrate each TDS project item by item, but unless you’re working on a very small Sitecore project or you’ve not got many items in source control this will likely take quite some time, so I wouldn’t recommend it.

Tools to Automate Migration

I spent quite a while looking around the Sitecore community to see what was already out there and there were 3 pretty good options that I found:

  1. Sitecore Serialisation Converter by Joseph Long – a .NET console app. Supports item and role serialization and is full featured: save path/relative save path, ignore list etc.
  2. A PowerShell script by Aaron Bikle – an impressive PowerShell script; less feature-rich than Sitecore Serialisation Converter, but it does support an ignore list and consolidated paths, which is nice.
  3. TDSC by Matthew Ellins – a .NET Core console app. Supports item serialization but not roles. Not as full-featured as Sitecore Serialisation Converter.

I looked at all of these and tested them but ultimately landed on Sitecore Serialisation Converter as it was the most full-featured, robust and flexible option.

I did however make quite a few updates to it and raised a PR – which Joseph has now merged, so you can take advantage of them too:

  • Improved the error handling by adding more checks around various aspects and logging out all errors/warnings and progress
  • Added logging via log4net (ssc-skipped.log and ssc-all.log)
  • Added tracking of errors/stats, with a summary at the end of successes, failures, skips etc.
  • Automation of the JSON module listing to add to your Sitecore.json file
  • Added support for removing ‘TDS’ from module names/namespaces (StripTDSFromName option) – for when you have TDS included in the name of your TDS projects and want to remove it
  • Added support for filtering on just a specific project name (ProjectNameToMatch option) – to allow running on a single project/testing. Leave blank to include all projects
  • Added a new SkipCreateIfExists option to not create the JSON file if it already exists
  • I also stole some of Aaron’s exclude list and added that :-)

More info below about how to use this.

Steps to migrate to SCS from TDS

The following steps give you an overview of the process I followed to migrate to SCS. It’s probably not the only way to do this, but it worked well for me and is likely pretty close to the approach you will need to take:

  1. Use Sitecore Serialization Converter to create new SCS Json module files from your existing TDS projects
  2. Install the Sitecore CLI and run the dotnet sitecore init command to create the Sitecore.json file
  3. Update the Sitecore.json file to include the new modules
  4. Install the SCS plugin using the following command:
    dotnet sitecore plugin add -n Sitecore.DevEx.Extensibility.Serialization
  5. Login the CLI to Sitecore (see the example login command after this list)
  6. Validate the module files with the command dotnet sitecore ser validate -v and modify them as required
  7. Run dotnet sitecore ser pull -v to sync your items to disk
  8. Resolve any issues with running the pull commands
  9. Push the new modules and .YAML files to source control
  10. Remove TDS projects and TDS item files from the solution
  11. Test the sync to another environment using the -whatif param
  12. Update your release pipeline to remove the TDS sync and add the SCS sync (a blog post will be coming soon on this)
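For step 5, the login command looks something like the below (the hostnames are placeholders for your identity server and CM instance; --allow-write is needed so you can push changes later):

dotnet sitecore login --authority https://id.mysite.localhost --cm https://cm.mysite.localhost --allow-write true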

Note: if you’re struggling with setting up the CLI then this is a really useful guide.

I’ve included some tips below on using the SCS commands.

Using Sitecore Serialization Converter

  1. Pull the project from GitHub
  2. Open the project in Visual Studio
  3. Edit the appsettings.config file as follows:
    ProjectDescription – set this to your project name; it will be added to your module files.
    SolutionFolder – set this to the root of your solution folder. TDS projects will be found from here.
    SavePath – you probably only want to set this for testing, so all module files are created in the same place (use the relative path instead to create them in each Feature/Foundation/Project folder).
    UseRelativeSavePath – set to false when testing using the SavePath above, but true when you’re ready to create the module files in a relative location.
    RelativeSavePath – set this if UseRelativeSavePath is set to true above. For me this path was ../.
    StripTDSFromName – set to true if you have the text ‘TDS’ in your project names and it will be removed when creating the .json files.
    ProjectNameToMatch – set this to filter on one or more projects. Useful for testing or for just targeting specific projects. Leave blank to find all TDS projects.
    SkipCreateIfExists – set to true if you want .json files to be skipped instead of recreated when they already exist. Useful if you’ve run SSC a few times.
  4. Either run the project in debug mode in Visual Studio OR build it, go to the .exe in the output folder via the command line and run it like so:
    SitecoreSerialisationConverter.exe
  5. Check the console/logs for any issues or errors; you should find the logs in the debug folder
  6. Check the summary looks correct in the console and the logs, and that no projects or important items are missing
  7. Copy your Module list from the output in the Console to use in your Sitecore.json file, e.g:

    "modules": [ c:/temp/MyProject.Project.Master.module.json",
    c:/temp/MyProject.Feature.Banner.Master.module.json",
    c:/temp/MyProject.Foundation.Caching.Master.module.json
     ],


Tips on Serialize Commands

  • Verbose – Run all the commands with -v after them so that you get full debugging of any issues. e.g: dotnet sitecore ser validate -v -i MyProject.Feature*

  • Filters – Use the -i param to filter by a specific project name pattern, e.g: dotnet sitecore ser pull -v -i MyProject.Feature* – will match all projects starting with ‘MyProject.Feature’

  • Validate – Use the validate command to check that your module JSON file paths are valid, e.g: dotnet sitecore ser validate -v. You can combine this with -i to validate just one project at a time.

SCS Issues and Errors

I hit a bunch of errors when trying to run the pull and/or validate commands. I think I had perhaps 100 errors or more to resolve across around 60 converted TDS projects, so it took me quite some time to resolve them all. It’s definitely easier if you target one project at a time, fix the issues, commit to source control and move on to the next. Here are the issues I tracked and what I had to do to fix them.

Duplicate Paths cannot be serialized

If this happens it means you have multiple module files which reference the same item/path. You need to find the duplicate references, remove the path from all but one module file and run the command again.
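If you have a lot of module files, a PowerShell sketch along these lines can help track the duplicates down (it assumes the standard items/includes structure of SCS *.module.json files):

# list any include path that appears in more than one module file
$includes = Get-ChildItem -Recurse -Filter *.module.json | ForEach-Object {
    $module = Get-Content $_.FullName -Raw | ConvertFrom-Json
    foreach ($include in $module.items.includes) {
        [pscustomobject]@{ Module = $_.Name; Path = $include.path }
    }
}
$includes | Group-Object Path | Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group } | Format-Table Module, Path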

Non Unique Paths cannot be serialized

If this happens it means an item with the same name exists more than once under the same parent in Sitecore, and SCS is trying to serialize it to disk as a YAML file but can’t create more than one file with the same path name.

To fix it, remove one or more copies of the item from Sitecore and run the command again. You will need to work out which is the correct one to remove. I used SPE to remove items by ID to speed this up.
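For example, something like this in the SPE console (the ID is a placeholder for the duplicate item you’ve identified):

Get-Item -Path "master:" -ID "{YOUR-DUPLICATE-ITEM-ID}" | Remove-Item -Recurse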

Serialized item contained a blank value

You will see an error like so if this happens:

Check your serialised item and you will see there is no language set. Then check the language versions in Sitecore and look for an invariant language. I found I had to run a SQL command on my master and core databases to remove these version languages:

DELETE FROM [sitecore_master].[dbo].[VersionedFields] WHERE [Language] = ''

DELETE FROM [sitecore_core].[dbo].[VersionedFields] WHERE [Language] = ''


Blank Database Values

There were some situations where the database value in my JSON files ended up empty for some reason. I did make some updates to the Sitecore Serialisation Converter to fix this by defaulting the value to master if the project name contains ‘Master’ and core if it contains ‘Core’. But if this happens to you then you will need to fix it manually in the JSON files, or SCS will throw errors:
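To find the offending files quickly, a sketch like this works (again assuming the standard items/includes module structure):

# flag module files where an include has no database value set
Get-ChildItem -Recurse -Filter *.module.json | ForEach-Object {
    $module = Get-Content $_.FullName -Raw | ConvertFrom-Json
    $blank = @($module.items.includes | Where-Object { [string]::IsNullOrEmpty($_.database) })
    if ($blank.Count -gt 0) {
        Write-Host "$($_.FullName): $($blank.Count) include(s) with no database set"
    }
}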

Cache already contains GUID

I also had some errors related to items already existing in the cache, something like:

/sitecore/content/* (GUID) attempted to be cached, but the cache already contained /sitecore/content/*. Non-unique paths cannot be serialized. Please choose a different item name.

If you get this error then you should be able to fix it with the following command, or perhaps with a combination of this fix and the non-unique items fix:

dotnet sitecore ser validate --fix

Source Control

One other thing to consider is how you approach adding your changes to source control to allow for easy tracking of changes and rollback etc. I did this by adding all my JSON module files first, then pulling each Helix layer (Feature/Foundation/Project) one at a time and pushing the YAML files in batches.

Next Steps

The next steps for me are adding the integration with the CLI to Github Actions, pushing the items from Source control to other environments and removing the TDS projects and serialized files. I will write about this in a future blog post.

Links

As usual there were a lot of links that helped with this, here are some of them:

https://sitecore.stackexchange.com/questions/36312/how-to-migrate-from-tds-to-cli

https://amitkumarmca04.blogspot.com/2021/07/Sitecore-Content-Migration-Using-Sitecore-Content-Serialization.html

https://spyn.me/sitecore-migrating-from-tds-to-scs/

https://sitecore.stackexchange.com/questions/33034/guid-attempted-to-be-cached-but-the-cache-already-contained-sitecore-c

https://jasonstcyr.com/2023/11/28/converting-a-tds-project-to-sitecore-content-serialization-scs/


Hopefully this is useful to others who need to do this.

Sitecore XP is alive & kicking – 10.4 is out now

Unfortunately I wasn’t able to attend Sitecore SUGCON in Dublin a couple of weeks ago as I already had a family trip planned for Easter. However the clear message coming from SUGCON is that Sitecore have a renewed focus on their customers who are still on the XP and XM platforms and are investing in them significantly in 2024 and beyond with support for DXP until at least 2032.

TLDR: Sitecore XP 10.4 is out now and is definitely worth a look. You can download it here and read more about it below.

A new Product focused CEO

You might have seen last week that Dave O’Flanagan was appointed CEO of Sitecore. This has really resonated with the Sitecore community and partners, as Dave joined Sitecore as part of the acquisition of Boxever (where he was CEO) back in 2021 and is therefore more product focused. This in my view is great news for the platform and will hopefully see customers getting a better ROI and Sitecore moving back into the Leaders category in the Gartner Digital Experience Platforms Magic Quadrant.

For those who were thinking XP/XM were dead, thankfully this does not seem to be the case at all, as proven by the release of Sitecore 10.4 today, which has a number of improvements and features that Sitecore XP customers will appreciate.
In this blog post I’ll provide an overview of some of the new features and improvements included and discuss what to expect in future releases of the DXP.

What can we expect in future XP/XM releases?

Sitecore talked at SUGCON about improving the user experience for marketers and creating a better developer experience for Sitecore developers too. UX improvements in XP will be a welcome update, as the interface has been due some modernisation for a while – especially the Experience Editor.

There will also be more composable integrations coming, to make it easier for customers to gradually move towards a SaaS approach.


AI integrations have been promised across all of Sitecore’s platforms and products – including XP/XM. We don’t really know what this will involve or look like yet, but it is likely to cover content creation, search, content testing, personalisation and more. I actually generated the header image for this post with AI, so it will be interesting to see how this all works – it could be a great addition for Sitecore XP users.

One of the strategies talked about at SUGCON is releasing Modules for Sitecore XP/XM to avoid having to wait for large platform releases. You will see this in practice already in the 10.4 release (and actually before) with some of the XM Cloud / CDP migration modules Sitecore have been working on to help customers move to SaaS from XP. Read on for more details.

What’s new in Sitecore XP 10.4?

This isn’t an exhaustive list but the highlights I’ve picked out that I think may be important to a number of customers. If you want to see a list of everything included you can read the release notes here. There are over 200 enhancements to functionality, robustness, performance, and supportability in total.

Accessibility Improvements

10.4 is now compliant with the W3C ARIA Authoring Practices Guide (APG) – content editors can now navigate the content tree, ribbon and fields via the keyboard, as well as create, rename, move and delete items and interact with fields. A visual indicator is also provided so that visually impaired users know where the cursor is (see the aria tags and purple outline below).
Accessibility will be a focus for a lot of Sitecore customers in the coming months, with the new European Accessibility Act coming into force in 2025.

Modules / Tools to help with Migration to SaaS

So, as I mentioned earlier, Sitecore are creating modules outside of the core platform for various features and functionality, which can be released independently of the platform. Well, 10.4 ships with an updated module and a brand new module, both designed to help customers move from XP/XM to SaaS more quickly and easily.

XM to XM Cloud Content Migration tool

Despite a version 1 of this tool existing since December 2023, I totally missed that it had been released. With 10.4, however, the new version of the tool available today provides more functionality than the old one. The tool is compatible with any version of Sitecore from 10.1 onwards and allows you to migrate content, media and user data to XM Cloud.

Key features are:

  • GUI (Graphical User Interface) and CLI (Command-Line Interface).
  • Select content items and media items with a TreeView.
  • Migrate media items using CDN or media binaries stored in on-premise MSSQL database.
  • Migrate users to an environment in Sitecore Cloud Portal.

Note: the tool will work with XP, but it is designed to export XM-only content. It will therefore not export site layouts/renderings, xDB data or commerce data, as these are not supported in XM Cloud and don’t exist in XM.

This should really simplify the process of moving your XP/XM data to XM Cloud when migrating. More features and functionality are likely to be added over the coming months.

xDB to CDP Migration Tool

This is a brand new tool available today with the release of 10.4 which allows customers to export their xDB data to the CDP platform. It utilises Sitecore Connect and region-specific recipes provided by Sitecore to extract the data and import it into CDP. The tool can be used with any version of Sitecore from 9.0 upwards and supports Contact facets (out-of-the-box and custom) and Contact List names.

Image from the https://doc.sitecore.com/ site

I’ve not had a chance to experiment with it yet, but it looks like it could be really helpful for customers looking to move to XM Cloud / CDP from XP.

XP Analytics Extractor

This is another new open source module which allows you to export interaction data from xConnect to a SQL database or CSV file. The tool is compatible with all versions of Sitecore XP from 10.0 onwards. It means you can combine xDB data with other data sets and also use Power BI to create reports. There isn’t much information about this currently from what I can see, but I believe one of the benefits is being able to reduce the data you store in xConnect by pruning it, while maintaining it elsewhere for reporting purposes.

Codeless Schema Extension module

I couldn’t find much on this module, but a new Codeless Schema Extension module is available in Sitecore Connect which enables business users to extend the xConnect schema without needing to write any code. I will add more detail here once I’ve had a chance to look at it.

SXA Updates – Tailwind support

You can now choose to use the Tailwind grid system for SXA instead of Bootstrap or one of the other supported grid systems. This is an interesting addition and gives more options to those using SXA on XP.

Image from the https://doc.sitecore.com/ site

Javascript Library Updates

I believe this was a lot of work for the Sitecore development team and I can see why – upgrading these across the XP platform was no doubt quite tricky. I’ve taken a look at some of the updates in 10.4 and the versions are now a lot newer, which means there are fewer security vulnerabilities (there are known issues in older versions of jQuery, for example).

Some updates of note that I’ve seen are as follows:

  • jQuery v1.12.4 (released in 2016!) > jQuery v3.6.3 (released in December 2022)
  • jQuery UI v1.10.3 (released in July 2013!) > jQuery UI v1.13.2 (released in July 2022)
  • Backbone.js v1.0.0 (released in 2013!) > Backbone.js v1.4.1 (released in 2022)
  • Bootstrap v3.2.0 (released in 2013!) > Bootstrap v3.4.1 (released in 2019)
  • Prototype v1.7 > Prototype v1.7.3 (not such a big upgrade)
  • RequireJS v2.1.10 (released 2014) > RequireJS v2.3.6 (released in 2018)
  • Sitecore Speak JS v1.0.1 (released 2013) > Sitecore Speak JS v1.0.3 (released 2023)

Security Improvements

The release notes say “Enhanced security and supportability across the product and in 3rd party libraries, to reduce potential vulnerabilities, and to reduce the likelihood of requiring security updates in the future”.
Looking at the list there are approximately 30+ security fixes/improvements included in 10.4. It’s impossible to know exactly what these security issues are, but it’s prudent to upgrade if you can, just to eliminate your exposure to them.

Compatibility Updates

Sitecore XP 10.4 adds support for SQL Server 2022, Solr 8.11, the latest Azure Kubernetes Service and containers for ltsc2022 images. Support for ltsc2019 images will be coming in the next few weeks.


So, in conclusion: whether you’re currently considering a move to the new SaaS XM Cloud platform (but have not taken the plunge yet) or plan to stay on XP/XM for the short or long term, it probably makes sense to upgrade to Sitecore 10.4 in the near future if you are able to, in order to take advantage of these improvements.

Hopefully this is a useful guide for others wondering about Sitecore XP 10.4. If you’d like to install it, the quickest way is usually SIA, but you can use SIF if you prefer.

Enabling Code Coverage for Sitecore with Coverlet & Github Actions

Last week I was tasked with enabling Code Coverage in our Sitecore Visual Studio solution and getting it into CodeCov (via our build pipeline). I ended up going down quite the Rabbit hole of different options and hitting a lot of brick walls along the way.

I finally figured it out and got it working though so thought I’d share my findings and what I did in the end to get this working.

TLDR - add this to your CI workflow in Github actions and adjust the settings as required.

What is Code Coverage?

In simple terms it gives you an idea of how many of your lines of code are covered by tests, and therefore how confident you can be in making changes and releasing without breaking things. I’m not going to get into whether this is a good idea, how accurate it is as an indication of the quality of your tests, or whether it’s a waste of time – I was just asked to get it set up and working. I don’t think we’re aiming for 100% code coverage, but we want to know the level of coverage we have and where we need to improve it. By the way, the header image above is a lie (I hacked it together) – 100% sure looks nice though :-).

What Code Coverage options are there?

There are quite a few, but some of them are paid for. Given the cost cutting across the board at the moment I felt free ones were best to investigate first. The ones I looked at were as follows:

Selected Tools

Read more below on reasoning but in the end I went with the following:

After trying AltCover for a while and struggling to get the filtering working on various dlls, I decided to try Coverlet. Coverlet seems to be the de facto standard and is now included by default in ASP.NET 6.0+ and .NET Core test projects in Visual Studio.

As our Sitecore 10.3 project is traditional MVC, we are tied to the .NET Framework 4.8. Our projects are also fairly legacy and have been upgraded a few times. Therefore it’s not possible to install Coverlet as a NuGet package within the test projects and use MSBuild, as I’d have liked to – that only seems to work for newer SDK-style projects and .NET Core ones, not classic .NET Framework projects. So I instead went with the Coverlet console, which in the end worked pretty well.

How do I use it?

So first you need to install the Coverlet console globally like so:

dotnet tool install --global coverlet.console

Then for each of your test projects you need to execute a command like so:

coverlet "C:\Projects\sc103-flux\src\Foundation\Accounts\Tests\bin\FluxDigital.Foundation.Accounts.Tests.dll" --target "C:\Program Files (x86)\Microsoft Visual Studio19\Community\Common7\IDE\Extensions\TestPlatform\vstest.console" --targetargs "C:\Projects\sc103-flux\src\Foundation\Accounts\Tests\bin\FluxDigital.Foundation.Accounts.Tests.dll /Parallel /Logger:TRX" --output "C:\Projects\sc103-flux\coverlet\coverlet-report1.cobertura" --format cobertura --include "[FluxDigital.*]*" --verbosity detailed

What this does is pass your test project dll to Coverlet and tell it to run the VSTest console to execute the tests (which in turn runs our xUnit tests). We also send some params to VSTest to ensure it runs the tests in parallel and logs the results. Lastly we pass some params to Coverlet to tell it to filter on certain dlls – otherwise it seems to try to instrument third-party dlls as well as our code. If you get any errors in the console it might be because you are not filtering everything out that you need to.

So to break it down in more detail:

  • coverlet – runs the Coverlet console
  • "..\FluxDigital.Foundation.Accounts.Tests.dll" – the test project dll to run code coverage on
  • --target "..\vstest.console.exe" – the path to the VSTest console; ensure this path is correct for your version of Visual Studio
  • /Parallel – runs the tests in VSTest in parallel
  • /Logger:TRX – logs the test results from VSTest in TRX format
  • --targetargs "..\FluxDigital.Foundation.Accounts.Tests.dll" – the path to the dll you are testing again, this time for VSTest
  • --output "..\coverlet-report1.cobertura" – the report file saved at the end of the test run
  • --format cobertura – the format for the above report file (this format allows us to merge the files from different test runs)
  • --include "[FluxDigital.*]*" – this parameter lets you filter the assemblies (dlls) and/or methods to include by name. In my case I only want code coverage for dlls that start with "FluxDigital." so this filters to just those. You can add multiple include params if you wish (see below).
  • --exclude "[*]*Model*" --exclude "[FluxDigital.Foundation.Models]*" --exclude "[*]*Controller*" – I’m not actually using these filters in my command above, but you can add multiple exclude parameters, e.g. to exclude any Models or Controllers from Coverlet.
  • --verbosity detailed – tells Coverlet to output a lot of detail when running the code coverage; really useful for debugging any issues.

I found some info here on include/exclude filtering and it was really helpful. Essentially patterns in brackets (e.g. [my.dll.name]) match assemblies, and patterns outside of brackets (e.g. *my.class.name*) match classes/methods.


Once it runs you will get a code coverage report, which you will note is in the Cobertura format. The reason for this is that we want to merge all of our test runs into one code coverage file, and the other formats don’t work for this. More on this later.

You need to run a similar command (change the test dll and report name) for each test library and save the code coverage file out with a different name but in the same folder. In my case this was 9 test projects and therefore 9 code coverage files generated. Like so:

Running this 9 times in our build pipeline isn’t going to cut it, so you will see I solved this later using PowerShell to find all test dlls and run these commands automatically – but I wanted to explain how this works more simply first.

Report Generator

To merge them I used ReportGenerator. We will also use this tool later to upload the report to CodeCov. First we need to install it like so:

dotnet tool install -g dotnet-reportgenerator-globaltool

Then with the following command we can merge the files (ensure the path is correct to find your individual cobertura report files):

reportgenerator "-reports:C:\Projects\sc103-flux\coverlet\*.cobertura" "-targetdir:C:\Projects\sc103-flux\coverlet\report" -reporttypes:Cobertura

This gives us a Cobertura XML file with all the code coverage data blended into one and generates an HTML report from it.

If you open up the index.html file in your browser you will see a summary of your code coverage at the top and then a breakdown by assembly below that. Hmm, 22% – not great at all. We have some work to do here to improve this, but that’s a job for another day.

This report is pretty neat though and is already enough for you to see where the gaps are in your coverage so you can decide where you need to add more tests.

Putting everything into Github Actions

The next step is to run this in the build pipeline (in our case Github Actions) and use Report Generator to send the file to CodeCov.

Running Coverlet via PowerShell for all Test Projects

As mentioned earlier, in order to make this simpler to run in the build pipeline (and more maintainable), I decided to write a PowerShell script which finds all test dlls that match a specific pattern (ensuring a unique list) and then executes the coverlet command (from above) for each dll in turn with the VSTest console.

This is what I came up with:

$basePath = "."
$reportPath = "coverlet"
$incNamePattern = "*Fluxdigital*test*.dll"
$incVSTestNamePattern = "[Fluxdigital.*]*"

# get all test dlls in the solution - filter here to reduce duplicates
$testdlls = (Get-ChildItem $basePath -Include $incNamePattern -Recurse |
    Where-Object { $_.FullName -match 'Release' -and $_.FullName -notmatch 'obj' -and $_.FullName -notmatch 'LocalPublish' }).FullName

# ensure we only get each test dll once by adding them to an arraylist
[System.Collections.ArrayList]$uniquedlls = @()
foreach ($testdll in $testdlls) {
    $fileName = [System.IO.Path]::GetFileName($testdll)
    if (-not ($uniquedlls -match $fileName)) {
        $uniquedlls.Add($testdll) | Out-Null
    }
}

# run coverlet for each test dll in the list
Write-Host "$($uniquedlls.Count) unique test dlls found..."
foreach ($uniquedll in $uniquedlls) {
    $fileName = [System.IO.Path]::GetFileName($uniquedll)
    $report = "$($reportPath)\coverlet-$($fileName.Replace('.dll','')).cobertura"
    $cmd = @"
coverlet $($uniquedll) --target "vstest.console.exe" --targetargs "$($uniquedll)" --output "$($report)" --format cobertura --include "$($incVSTestNamePattern)" --verbosity detailed
"@
    Write-Host "running tests for: $($fileName) - report path: $($report)"
    $($cmd) | cmd
}

This is used in the Github Action below so you will need to update the $incNamePattern and $incVSTestNamePattern to match your test dlls when using it in your Github workflow. You could obviously just use it locally to generate a report too.

The Final Github Actions YAML

In order to use Coverlet, the VSTest console and ReportGenerator in Github Actions I needed to add some steps to the build pipeline to install the tools. I also wanted to show the code coverage in the Github Action summary, so I eventually found a marketplace action that would do that (and work with Windows runners), and then finally an action to send the report to CodeCov. Note you will need to update this action with your repo details and CodeCov token (stored in secrets).

Please review all the settings below too before trying this in your CI pipeline:

Just like running Coverlet locally from the command line you get a summary as it runs in Github too so it’s easy to debug any issues:

The report summary looks like so, pretty cool I think. You can configure this to work for PRs too if you wish.

Once you have this all working you may need to reduce the log levels so it’s not as noisy in the console.

Incidentally, AltCover seems very clever, and if you can get it to work correctly it might be better than Coverlet – so give it a try too if you have time.

Hopefully this is useful for others who need to get Code Coverage setup for legacy Sitecore MVC projects (or other older .NET Framework projects). I’m sure a very similar approach would work in Azure Devops or other CI/CD platforms too. I’m off to write some more Unit tests.

As always there were a lot of useful links out there that helped me with this in addition to the ones I’ve included above:

https://blog.ndepend.com/guide-code-coverage-tools/
https://medium.com/@justingoldberg_2282/setting-up-code-coverage-with-net-xunit-and-teamcity-for-a-solution-with-multiple-test-projects-5d0986db788b

https://stackoverflow.com/questions/67058242/using-coverlet-with-net-framework-generates-an-error-the-expression-system-v

https://stackoverflow.com/questions/60707310/is-it-possible-to-get-code-coverage-of-net-framework-project-using-coverlet-in

https://stackoverflow.com/questions/60838586/how-to-output-code-coverage-results-file-of-solution-tests-to-solution-directory

https://stackoverflow.com/questions/62512661/how-to-generate-line-coverage-report-with-vstest-console-exe

Sitecore Page Exporter

Something I need to do regularly is pull down a page from a higher environment (such as UAT or Production) to my local machine or Test. I’ve done this in the past by manually building packages, using Sitecore Sidekick or SPE’s ‘Quick Download Tree as package’ option.

However, SPE’s package option does not support packaging up the datasource items (unless they are child items of the page). In my experience there are often global datasources that are not sub-items of the page, and packaging these up manually can take quite some time, especially for large pages.

Enter Sitecore Page Exporter

So I decided to create ‘Sitecore Page Exporter’ using SPE, which handles this. It supports exporting a specific page as a package, and optionally its datasources, images and sub-items. This is v1, so I plan to add more features in the near future.

Pre-requisites

You must have Sitecore PowerShell Extensions installed. This release has been tested with Sitecore 10.3 and SPE 6.4 but should work with older versions also.

Install Notes

  • Download the v1 package from the release link
  • Install the package using the Sitecore package install option in the Sitecore Desktop
  • You should now have Sitecore Page Exporter installed under the SPE module:

Usage

  • To export a page, right-click the page in the Content Editor and choose: Scripts > Export Page as Package:
  • The following options are then available to you:
  • Choose your options and click ‘OK’
  • Download and save the package
  • You get an overview of the export if you click ‘view script results’:
  • You will also get a summary at the end of the number of items included:
  • Upload the package to wherever you want to use the page (e.g. your development machine)

Hopefully this is useful for others too. Let me know of any features you think might be added or any issues you have with this.

Automating Sitecore Azure SQL Database Maintenance

For a long time Sitecore have recommended that you run SQL maintenance regularly and rebuild the indexes. However, you can’t run maintenance plans like this in Azure (as you would in an on-prem environment).

So I did some research, and it seems that Sitecore set these up for you if you’re using Managed Cloud, but I couldn’t find much further info on this.

However, I did come across this SSE post with a very useful answer from Richard Hauer on using Azure Runbooks and PowerShell to run database maintenance.
There was unfortunately not a lot of detail on how to set it up or use it, and I’d only really used Azure Runbooks once before (for monitoring and re-starting Solr), so I am certainly no expert on this.

So having done this recently I thought I’d write this post to help others who need to do this, follow the steps below.

Step 1 – Create a new automation account

If you don’t have an existing Azure Automation Account you will need one so go to the Automation Accounts section in Azure Portal and create one.

If you have an existing Automation Account you can move on to Step 2.

Step 2 – Create Runbook & Add Script

Note: these need to be migrated to extension-based Hybrid Workers by August 2024; however, Microsoft provide a simple approach to do this. I haven’t used these yet as I don’t have VMs available to run the workers, but we will do this soon, so please bear this in mind.

Under Runbooks in the Automation account click ‘Create a runbook’:

Then name it something like ‘Sitecore-DB-Maintenance-Plan-Workflow-RB’. Ensure you choose ‘PowerShell Workflow’ as the runbook type – otherwise the script doesn’t work correctly:

Click on the Runbook you just created and choose ‘Edit in portal’:

Then paste in the script (see below):

This is the script to copy and paste. It’s a modified version of the one Richard shared on SSE.
It includes more logging and comments. Note that some of the additional logging shows up in the ‘All Logs’ section, as it is verbose:
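I can’t embed the full script here, but a heavily simplified sketch of the shape of such a runbook looks like the below. The parameter names match step 5, Get-AutomationPSCredential pulls the credential created at step 3, and note this simplified version rebuilds every index – the real script only rebuilds indexes that are actually fragmented:

workflow Sitecore-DB-Maintenance-Plan-Workflow-RB
{
    param(
        [Parameter(Mandatory = $true)][string]$SQLServer,
        [Parameter(Mandatory = $true)][string]$Database,
        [Parameter(Mandatory = $true)][string]$CredentialsName
    )

    # pull the SQL admin credential stored against the automation account (step 3)
    $cred = Get-AutomationPSCredential -Name $CredentialsName

    InlineScript {
        $cred = $Using:cred
        $connString = "Server=tcp:$($Using:SQLServer),1433;Database=$($Using:Database);" +
            "User ID=$($cred.UserName);Password=$($cred.GetNetworkCredential().Password);Encrypt=True;"
        $conn = New-Object System.Data.SqlClient.SqlConnection($connString)
        $conn.Open()

        # simplified: rebuild all indexes; the real script walks
        # sys.dm_db_index_physical_stats and only touches fragmented ones
        $cmd = $conn.CreateCommand()
        $cmd.CommandTimeout = 3600
        $cmd.CommandText = @"
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += N'ALTER INDEX ALL ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N' REBUILD;'
FROM sys.tables t JOIN sys.schemas s ON t.schema_id = s.schema_id;
EXEC sp_executesql @sql;
"@
        $cmd.ExecuteNonQuery() | Out-Null
        $conn.Close()
        Write-Output "Index rebuild completed for $($Using:Database)"
    }
}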

You can test this if you like in the test pane but once you are happy with it publish it.

Step 3 – Create Credentials

Now we need to add our SQL admin username and password as Azure credentials. If you don’t have an existing SQL admin user you can use, create one which has the access required to rebuild indexes.

Next, add a new credential under the automation account by clicking ‘Add a credential’:

Add the credential details like so, with a name such as ‘DatabaseCred’:

Step 4 – Create Schedules

Now we need to create a schedule for each Sitecore database that we want to Re-Index. This will run the Runbook Workflow script on a schedule.

Under the automation account click ‘Add a schedule’:

Then add the schedule details. For example, the below is for the Master database.

Sitecore recommend indexing is done weekly, and in my case we want to run it out of hours (3am) and not over a weekend or near a Monday (as that is the busiest day for this client). This may vary for you, so adjust accordingly:

Repeat this for each database you want to re-index. I set up schedules for the main databases: Master, Core and Web:

Step 5 – Link Schedules & Set Parameters

Now we need to link the existing Schedules to the Runbook. Go to the ‘Sitecore-DB-Maintenance-Plan-Workflow-RB‘ Runbook and click ‘Link to schedule’:

Then select the Runbook Schedule by clicking ‘Link a schedule to your runbook’:

And select a schedule from those you setup previously at Step 4.

Then click ‘Configure Parameters and run settings’:

Set the parameters for the SQLServer, Database and CredentialsName like so, using the credentials you set up at step 3:

Step 6 – Set up Logging & Alerts

Under the runbook’s ‘Logging and tracing’ settings, turn on ‘Log verbose records’ like so:

You can setup alerts if you would like to for errors under the automation account by creating an alert rule and filtering on the Runbook logs:

Step 7 – Test and check Logs

Once the Runbook schedule has run you can check the output under the ‘Jobs’ section of the runbook:

Check the ‘All logs’ section too and you should see more information such as how fragmented the tables were and the number of fragmented tables found:

That’s it – you should now have a working runbook workflow that automates the re-indexing and stops your databases becoming slow. Hopefully this is useful for others too.

Here are some other useful links that I found to help with this:

https://gist.github.com/ivanbuzyka/70db190d540e34300dab5015f21d00bf

https://github.com/yochananrachamim/AzureSQL/blob/master/AzureSQLMaintenance.txt

https://segovoni.medium.com/automating-azure-sql-database-maintenance-tasks-overview-bdbadcb312bf

https://learnsitecorebasics.wordpress.com/2023/04/30/sitecore-commerce-user-creation-takes-too-long-or-turns-into-timeout-error/

https://devjef.wordpress.com/2017/08/28/running-database-maintenance-on-azure-sql-db-with-azure-automation/

https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages

https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual

Bulk Enable/Disable Sitecore Users with SPE

We’re currently pretty close to completing an upgrade to Sitecore 10.3 for a client, and during the go-live process we needed to disable most of the users (apart from a few admin users) and then re-enable them again after go-live.

We have a lot of users in the system and so I turned to Sitecore PowerShell Extensions (SPE) to automate this process. Here is the script I came up with:
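Since I can’t inline the whole thing here, this stripped-down sketch shows the core idea (the real script adds the dialog for choosing enable/disable and excluding admins). Disabling a user maps to the ASP.NET Membership IsApproved flag, which Sitecore’s users are built on:

# disable (or re-enable) all users except the excluded accounts
$excludeUsers = @("sitecore\Admin")   # never disable the main admin account!
$disable = $true                      # set to $false to re-enable everyone

foreach ($user in Get-User -Filter "sitecore\*") {
    if ($excludeUsers -contains $user.Name) { continue }
    $member = [System.Web.Security.Membership]::GetUser($user.Name)
    if ($null -ne $member) {
        $member.IsApproved = -not $disable
        [System.Web.Security.Membership]::UpdateUser($member)
        Write-Host "$($user.Name) $(if ($disable) { 'disabled' } else { 'enabled' })"
    }
}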

When you run the script it shows a dialog which allows you to select whether you would like to enable or disable users, and to choose which admin users you would like to exclude when running the disable/enable:

Obviously you don’t want to accidentally lock yourself out of Sitecore by disabling the main sitecore\Admin user! I’ve therefore put a check in to try and stop this happening:

Once the script has completed you will see a modal confirming the number of users Disabled/Enabled:

Then you will be shown a report showing a list of all the users that have been either Enabled or Disabled:

Note that as I unchecked the sitecore\testadminuser in the modal dialog it has disabled this user along with all the other non-admin users in Sitecore.

These screenshots are from my local dev environment, but I’ve tested this script on hundreds of users and it runs in a few seconds.

Hopefully it’s useful for others who need to do something similar and can be easily updated too.

Deleting IAR items from the Database & Content Editor warnings for over-written IAR Files

Having recently created a Sitecore 10.3 IAR package for the Scheduled Publishing module, I needed to remove the items from the database, as they were still there even though they are now in the .dat files I created.

In previous versions of Sitecore it was quite tricky to do this, but luckily we’re using Sitecore 10.3, and the Sitecore CLI has been updated to allow us to delete specific items from the database with the itemres cleanup command.

The commands we need to run are as follows:

dotnet sitecore itemres cleanup -p "/sitecore/templates/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Tasks/Schedules/ScheduledPublishTask" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Tasks/Commands/ScheduledPublishCommand" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Modules/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Gutters/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Ribbons/Strips/Publish/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Ribbons/Chunks/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Field types/Custom Field Types" -r

It’s possible to run these commands with the ‘what if’ flag (-w) to see what would happen if you ran them, which is quite handy for testing them first. You will see a message saying that no changes will be made:
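For example, adding -w to the first command above:

dotnet sitecore itemres cleanup -p "/sitecore/templates/Scheduled Publish" -r -w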


Note that unfortunately it’s not possible to run the ‘what if’ without providing a path. It seems this might be coming in 10.4:


Once you’ve run the commands properly (without the -w switch) you will see confirmation that the item(s) were removed like so:

The next step was to check that the above deletes had worked correctly and that all the items were indeed coming from the IAR files and not from the database.

I decided a Content Editor warning would be a good way of doing this. I have created these using SPE before, so I had a look around and found this really useful post from Jan Bluemink on doing this for IAR files. It mostly worked OK, but the code that was shared had some issues with the formatting and I wanted to make some improvements. Here is my updated version:
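I can’t inline the whole thing here, but this stripped-down sketch shows the shape of it. $item and $pipelineArgs are provided by SPE for this integration point; the detection against the head (SQL) data provider is the part to treat as an assumption – see Jan’s post for the full logic:

# content editor warning: is this item served from an IAR file or the database?
$db = $item.Database
$providers = $db.GetDataProviders()
$context = New-Object Sitecore.Data.DataProviders.CallContext($db.DataManager, $providers.Count)
# the CompositeDataProvider's HeadProvider reads the SQL database only, so a
# non-null definition means the item has been over-written in the database
$inDatabase = $null -ne $providers[0].HeadProvider.GetItemDefinition($item.ID, $context)

$warning = $pipelineArgs.Add()
if ($inDatabase) {
    $warning.Title = "Item over-writes an IAR resource"
    $warning.Text = "This item also exists in the database, so the database version is used."
}
else {
    $warning.Title = "Item served from an IAR file"
    $warning.Text = "This item is coming from an items-as-resources (.dat) file."
}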

Note: to use this you need to ensure that your script library is configured as a ‘PowerShell Script Module’, that the integration points for Content Editor Warnings are enabled, and that the script is placed in the correct sub-folder (Warning).

The script displays a blue Content Editor info message if an item is coming from an IAR file and hasn’t been over-written, like so:

And if it has been over-written (i.e. it is coming from the database) then it shows an orange warning message like so:

This was really useful for confirming that the IAR files were working as expected. I actually set this up before running the cleanup commands above so that I could check I was getting the Orange message initially and then the Blue one after running the cleanup commands.

You can test this yourself if you like by opening this item in Content Editor: /sitecore/system/Marketing Control Panel/Taxonomies/Campaign group

This item comes from the item.master.dat file out of the box.

Another helpful tool is this SPE report that Jan Bluemink created – it lists all over-written IAR items from a .dat file.

Hopefully this is useful info for anyone else who needs to cleanup IAR files and check the cleanup has worked correctly.

Below are some other useful links I found when working on this:

https://doc.sitecore.com/xp/en/developers/103/sitecore-experience-manager/work-with-items-as-resources.html
https://uxbee.eu/insights/items-as-resources-by-sitecore-part-3
https://jeroen-de-groot.com/2022/01/05/remove-items-from-resource-file/
https://gist.github.com/michaellwest/13e5a49b34340b9ebebdb83ff2166077

Convert Publish files to Sitecore CLI JSON format


I’m currently working on a Sitecore upgrade for a client, and this week I needed to upgrade the Scheduled Publishing module to be compatible with Sitecore 10.3. Whilst the code had been upgraded recently by Nehemiah Jeyakumar, there was still no package for it.

I was really keen to use an Items as Resources (IAR) version, but to do so I’d need a Sitecore CLI JSON module file, which I didn’t have. There was however a Package.xml file which had been used to create previous Sitecore packages for the module.

I wondered if I’d be able to use this to create the Sitecore CLI JSON file, but I couldn’t find anything online from anyone who had done it. So I decided to write a PowerShell script to do this for me. You can find it below:


The script essentially loads each x-item entry in a Package.xml file, calculates the item path and name, then generates a JSON file in the Sitecore CLI serialization format and saves it to disk for you.
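The heart of the approach looks something like this simplified sketch (it assumes x-item entries in the usual /database/item/path/{GUID}/language/version form and the module JSON shape SCS expects; the variables are the 4 described in the next section):

# read each x-item entry from the package definition and emit an SCS module JSON file
[xml]$package = Get-Content $inputPackageFilePath

$includes = foreach ($xitem in $package.SelectNodes("//x-item")) {
    # entries look like: /master/sitecore/templates/Scheduled Publish/{GUID}/invariant/0
    $parts = $xitem.InnerText.Trim('/') -split '/'
    [ordered]@{
        name                  = $parts[$parts.Length - 4]
        path                  = '/' + ($parts[1..($parts.Length - 4)] -join '/')
        database              = $parts[0]
        scope                 = "SingleItem"
        allowedPushOperations = "CreateUpdateAndDelete"
    }
}

$module = [ordered]@{
    namespace   = $namespace
    description = $description
    items       = @{ includes = @($includes) }
}
$module | ConvertTo-Json -Depth 5 | Set-Content $outputJsonFilePath
Write-Host "Wrote $(@($includes).Count) item include(s) to $outputJsonFilePath"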

How to use it

Open the script in PowerShell ISE on your PC and update the 4 variables below.
The $inputPackageFilePath should be your existing package file and $outputJsonFilePath where you would like to save the JSON file. The namespace and description should reflect your module.

# variables - UPDATE THESE 4 AS REQUIRED
$inputPackageFilePath = "C:\Projects\SCScheduledPublishing\Packages\Sitecore Scheduled Publish.xml"
$outputJsonFilePath = "C:\Projects\SCScheduledPublishing\Sitecore.Schedule.Publish-module.json"
$namespace = "Sitecore.Schedule.Publish"
$description = "Sitecore.Schedule.Publish Module Items"

Once you have updated these variables you can run the script, and all being well you will get a JSON file saved out as specified. You should see an output similar to the below, with a summary of the number of items created and the location of the JSON file:


You should see that the script automatically works out which database each item should go into (from the package path), e.g. Master or Core.
Note: the script currently sets all items to ‘SingleItem’ scope and ‘CreateUpdateAndDelete’ for allowedPushOperations, so you may want to adjust some of these manually.

After that you can just run the serialize command from your Sitecore Solution folder like so:
dotnet sitecore ser pull

And then run the Items as Resources command to create one or more .dat files with the items included:
dotnet sitecore itemres create -i Sitecore.Schedule.Publish -o SCScheduledPublishing\src --overwrite


I have a blog post here with some more info on these steps (it’s for an older version of the CLI but will work fine): https://www.flux-digital.com/blog/creating-custom-items-resources-sitecore-cli-4-0/

Hopefully this is useful for others who need to do this for a module.

Incidentally, if you need a Sitecore 10.3 version of the SCScheduledPublishing module, you can find a normal and an IAR package for 10.3 here: https://github.com/nehemiahj/SCScheduledPublishing/tree/main/Packages

My Sitecore SUGCON 2023 Takeaways – Day 2

> DAY ONE - If you haven't read about Day One you can read it here.

SUGCON DAY 2


The second day of SUGCON started bright and early, so after a quick breakfast and cup of tea at the hotel I headed down to the first session I’d planned to see.

Rob’s session was one of the key sessions I really didn’t want to miss this year. A few clients I’ve spoken to recently (and other Sitecore devs I’d chatted to at SUGCON so far) are facing this challenge:

‘How do we move to XM Cloud from XP and what do we need to consider?’

– so I was keen to learn from Rob’s experiences.

Migrating advanced Sitecore implementations to XM Cloud – Rob Habraken


Rob started by telling us the differences with XM Cloud and explaining how publishing works differently (given you publish to the Edge):


Rob then shared a typical XP implementation diagram and showed how XM Cloud differs, as integrations and functionality move into the head application:


He then discussed in detail what is and isn’t included in XM Cloud. Martin had shared some similar slides the day before, but I think these were a little clearer, so I didn’t include them in the previous post:


This was also a pretty cool comparison of XP vs XM Cloud equivalent features:


Rob then discussed the Migration approach to XM Cloud. There was a lot of really useful info here about things to consider and how to get your project prepared for the migration and how to tackle it:



Next up were the different development approaches and workflows. I’ve talked about these before, but I didn’t know much about option 3 at all. I guess most Sitecore developers (especially in a small team) will use option 1, but option 3 is a really good approach for being able to use local content for your development without having to push it to XM Cloud:

Rob then went on to explain in detail how Content Resolvers don’t work if they are dynamic – only static ones do. It’s possible to use some of the out-of-the-box ones or implement your own GraphQL Content Resolver:


This is an example of breadcrumbs in XM Cloud and a GraphQL search query:



Rob finished his talk with a summary of the benefits of XM Cloud. The shift in development domain and thinking is the tricky part for a lot of Sitecore developers, I feel.

Rendering your data in headless – 101 different ways – Mike Edwards


I’ve known Mike for a number of years now and he’s always a good speaker, so I was looking forward to him sharing his learnings from his headless journey.


Mike started by lamenting how things used to be easy in the world of MVC and server-side development, and how, with all the jQuery and JS frameworks that followed, things became pretty bloated.

Things have moved on a lot in FE development though, and there are now many different options for building headless websites with Sitecore – some of these I’m aware of or have experimented with, others I’d not heard of, such as ‘Island Architecture’.


SPAs bring their own set of problems in terms of page load times and indexability, so Mike went into hydration and partial hydration techniques and approaches that try to solve these issues:


Then Mike explained more about partial hydration examples and Island Architecture. Island Architecture lets you create your web app with pure HTML and CSS for all the static content, but then add in regions/placeholders of dynamic content to support interactive ‘islands’. Given the rest of the page is static, it downloads really quickly and is available to use faster.


Mike then covered Resumability, Edge/Serverless and tools such as Storybook and Hydration Payload.


There are some challenges and limitations which still need to be addressed:


Finally, Mike ended by saying that this is the future and we need to embrace the new world.

It was a really interesting talk and gave me a lot to think about and research further. The following talks were 15 minute lightning talks until lunch.

Leverage Sitecore Connect for Sitecore CDP – Sarah O’Reilly


I’d heard a fair bit about Connect, but I hadn’t really seen much about how it actually works, so I was looking forward to this session.

Sarah took us through an example of using Connect to import user segment data from CDP into Google Ads.


Once the export was set up to build from CDP, the steps were then configured in Connect to sync to Google Ads:

There are tons of apps supported and different recipes defined, and it was impressive to see the options for building logic – such as if statements, for loops, data mapping and manipulation – all within Connect.


This was an insightful session and it was really interesting to see how it all works. I can see how it could be used to help with migrating to XM Cloud from XP or another CMS platform.

Sitecore components explained for your marketers – Ugo Quaisse

The next session was about the Sitecore Components builder in Pages in XM Cloud. I’ve heard a bit about this but not seen much of it in detail, so I was hoping to see a full demo. As the session was only 15 minutes there wasn’t time for that, but I still learned quite a bit about how it works.


The Component Builder can be used without any development or code required at all. First, themes are set up, with colours, fonts and breakpoints configured.

Then datasources are set up and mapped from either a URL, JSON or GraphQL.


Then the component’s look and feel – layout, dimensions and sizing – can be configured in the builder, which looks pretty neat. Then versioning and publishing are set up for the component.


Lastly, some details were shared around the benefits for digital creatives – it’s possible to get sites built very quickly and easily using the Components Builder.


Leveraging XM Cloud APIs and Webhooks to powerup integrations – Ramkumar Dhinakaran & Elakkuvan Rajamani


After lunch it was time for another session, this time on webhooks. The use case here was an XM Cloud Lighthouse integration which would do an automated quality check of pages using webhooks and report on it.



Depending on the integration required, it might not be best to use a webhook:

Quite a lot of detail was shared on how this all works and integrates.


There were some links and takeaways shared at the end.


Sitecore Search: Real case PoC – Sebastian Winslow & Jesper Balle


The second-to-last session of the day was on Sitecore Search (based on Discover), which I was keen to learn more about as I didn’t know much about how it worked.


The CEC looks pretty powerful and can be used to manage search; performance is key, and widgets can be configured for search and catalog:


Some dev resources and admin info were shared:


The use case for search was a property site. There are still some features that need to be built.


Some info was then provided on triggers to get the content, and on request and document extractors to process and manipulate it.


There was also a look at the Search API endpoints, the results response, the API Explorer and the ability to refine the widgets.


It’s early days and the search SDK is still not quite there yet, but it’s coming. Be careful with how much content you try to index when testing, but there are some significant benefits to using it.


This was a really informative session and gave me all the info I was looking for about how to go about implementing search.

Experiences with Content Hub One – Journey of relaunching our Usergroup website – Katharina Luger & Christian Hahn


Then it was time for my last session of the day, on how the Sitecore User Group Germany rebuilt their site as an SPA using Content Hub One.

The slide below was probably the simplest comparison I saw all SUGCON of the differences between XM Cloud and Content Hub One.


There are 7 Steps to component creation:


Lastly, there were some challenges faced along the way.


This was a really great session and I’m looking forward to working with Content Hub One in the future.

Virtual Closing Keynote by Scott Hanselman


There was then a really entertaining and insightful virtual talk from Scott Hanselman. He had some great advice, wisdom and stories to share, and I think everyone in the room was pretty captivated by his talk.


With that, it was the end of SUGCON 2023. There was a big round of applause for all the organisers – these events take a hell of a lot of organising and a real commitment from everyone involved.


It was time to go and have a few beers and reflect on what was a another brilliant SUGCON.

Hopefully this is useful info for anyone that couldn’t attend this year or had too many beers and forgot what they learned :-).