
Reduce Technical Debt Part 2 – Empty Try Catch

Here is the second post in the series on how to reduce technical debt. Please read part one, as it gives an insight into the scale and challenges we faced and outlines what this blog post is trying to address.

The first part introduced a few code examples to help remove redundant code. This post will continue to focus on how to remove redundant code by introducing the EmptyTryCatchService class and the IgnoreEmptyTryCatch custom attribute.

But before that I briefly want to mention integrations; in my experience this is where a lot of redundant and/or unnecessary code can hide.

Integrations

Therefore, an important concept in reducing technical debt is to identify, separate and isolate dependencies on external systems, especially complex and/or legacy systems.

I have already written a blog series about this, so if you missed it, please read it.

Integrations Platform

I believe that in an ideal world, most integrations, and especially complex and/or legacy system specific code, should be moved out of the website solution to an integration platform!

Most of the issues, difficulties and costs relating to code maintenance and technical debt for websites are due to the website being responsible for things it should not be.

For example, when the website is responsible for aggregating data from several systems to provide a unified view of their data: NO, this is the job of an integration/aggregation platform.

Empty Try Catch

So, let me start by stating – ignoring exceptions is a bad idea, because you are silently swallowing an error condition and then continuing execution.

Occasionally this may be the right thing to do, but often it’s a sign that a developer saw an exception, didn’t know what to do about it, and so used an empty catch to silence the problem.

It’s the programming equivalent of putting black tape over an engine warning light.

It’s best to handle exceptions as close as possible to the source, because the closer you are, the more context you have to do something useful with the exception.

Ignore Empty Try Catch – Custom attribute

In some rare cases an empty try catch can be valid, in which case you can use the custom attribute to mark the function and explain why it is OK – but first check one last time whether there is a TryParse version of the function and/or code you are calling.
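
For illustration, here is a minimal sketch of what such a marker attribute could look like; the attribute name comes from the post, but the Justification property and the usage example are my assumptions:

using System;

// Marker attribute: documents that an empty catch is deliberate.
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Constructor)]
public sealed class IgnoreEmptyTryCatchAttribute : Attribute
{
    public IgnoreEmptyTryCatchAttribute(string justification)
    {
        Justification = justification;
    }

    // Why it is acceptable to swallow exceptions in this member.
    public string Justification { get; }
}

public class CacheWarmer
{
    [IgnoreEmptyTryCatch("Best-effort warm-up; no TryXxx alternative exists and failure is harmless")]
    public void WarmUp()
    {
        try
        {
            // best-effort call to prime the cache
        }
        catch (Exception)
        {
            // intentionally ignored, see attribute justification
        }
    }
}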

Performance

Slightly off topic, but still a type of technical debt: do not use exceptions for program flow!

Throwing exceptions is very expensive (the runtime must dump the registers, call stack, etc., and whilst doing this it blocks all threads), so it has a big impact on performance.

I have seen sites brought to their knees because of the number of exceptions being thrown.
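
A simple illustration: parsing with exceptions versus the TryParse pattern, where the hot path never throws:

using System;

public static class InputParser
{
    // Exception-driven flow: every invalid input pays the full cost of a throw.
    public static int ParseWithExceptions(string input)
    {
        try
        {
            return int.Parse(input);
        }
        catch (FormatException)
        {
            return 0;
        }
    }

    // Preferred: no exception is ever thrown for invalid input.
    public static int ParseWithTryParse(string input)
    {
        return int.TryParse(input, out var value) ? value : 0;
    }
}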

Redundant Code

In the solution we took over there were over 300 empty try-catch statements ☹

But how can it hide redundant code?

When an exception is thrown it can jump over lots of code, which is therefore never called.

Therefore, all the code after the exception is redundant.

Below is the classic Hello World program. It works as expected and prints out “Hello World”.

But there is a lot of technical debt. Now this might look like a funny example, but I have seen a lot of similar examples in the real world, usually with a lot more code in the try catch, and usually found around big complex integrations!

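The original code screenshot is not available here, so below is a minimal sketch of the kind of program it showed; the helper method name is illustrative:

using System;

internal static class Program
{
    private static void Main()
    {
        try
        {
            Console.Write("Hello");
            ThrowingIntegrationCall(); // always throws
            // Everything below this call is unreachable, i.e. redundant
            // code silently hidden by the empty catch.
            Console.Write(" Cruel");
        }
        catch (Exception)
        {
            // Swallowed: nobody notices that half the block never runs.
        }
        Console.WriteLine(" World");
    }

    private static void ThrowingIntegrationCall()
    {
        throw new InvalidOperationException("hidden integration error");
    }
}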

Solution – EmptyTryCatchService

For empty try catches I would not recommend using Sitecore’s standard logging, as it can create enormous log files, which is enough to kill your Sitecore solution if the empty try catch is called a lot.

For tracking down empty try catches, it is good to have a dedicated log file and a way to limit the amount of data written to the log file.

The EmptyTryCatchService class provides the following features:

  • Report interval – the interval at which exceptions with the same owner, name and exception message are written to the log file.
  • Max log limit – when the number of exceptions with the same owner, name and exception message is exceeded, no more data is written to the log file.
  • A dedicated log file for each day.
  • The ability to disable all logging via configuration.

The EmptyTryCatchService class is a simple class that relies on the MaxUsageLog class for most of its functionality (see the code below).

In addition to finding redundant code, the EmptyTryCatchService will track down hidden errors and problems in your solution, which will result in a reduction of the technical debt.

You must be careful when reviewing the logged exceptions and deciding how best to deal with them. See part 3 in the series on reducing technical debt.

using System;
using System.Collections.Generic;
using System.Configuration;
using System.IO;

public class EmptyTryCatchService
{
    private readonly MaxUsageLog _maxUsageLog =
        new MaxUsageLog(10000, "EmptyTryCatchService", 1000);

    public void Log(object owner, string message, Exception ex = null)
    {
        _maxUsageLog.Log(owner, message, ex);
    }
}

public class MaxUsageLog
{
    public MaxUsageLog(int maxLogLimit,
        string fileNamePrefix,
        int reportCountInterval = 1000000)
    {
        _maxLogLimit = maxLogLimit;
        _fileNamePrefix = !string.IsNullOrEmpty(fileNamePrefix) ? fileNamePrefix : "MaxUsageLog";
        _reportCountInterval = reportCountInterval;
    }

    public void Log(object owner, string message, Exception ex = null)
    {
        if (!IsEnabled())
            return;

        string type = string.Empty;
        if (owner != null)
        {
            // The owner can be passed as a Type (for static contexts) or as an instance.
            type = owner is Type typeObj ? typeObj.FullName : owner.GetType().FullName;
        }
        string key = GenerateKey(type, message, ex);
        if (!ShouldLog(owner, key))
            return;
        var count = Count(key);
        WriteToFile(owner, type, message, ex, count);
    }

    private int Count(string key)
    {
        return Usage.ContainsKey(key) ? Usage[key] : 0;
    }

    private void WriteToFile(object owner, string type, string message, Exception exceptionToLog, int count)
    {
        try
        {
            StreamWriter log = File.Exists(FileName) ? File.AppendText(FileName) : File.CreateText(FileName);
            try
            {
                log.AutoFlush = true;
                log.WriteLine($"{DateTime.Now.ToUniversalTime()}: Type:'{type}' Message:'{message}' Count:{count}");
                if (exceptionToLog != null)
                {
                    log.WriteLine($"Exception:{exceptionToLog}");
                }
            }
            finally
            {
                log.Close();
            }
        }
        catch (Exception ex)
        {
            if (!Sitecore.Configuration.ConfigReader.ConfigutationIsSet)
                return;
            Sitecore.Diagnostics.Log.Error(
                $"Failed writing log file {FileName}. The following text may be missing from the file: Type:{type} Message:{message}",
                ex, owner);
        }
    }

    private bool ShouldLog(object owner, string key)
    {
        if (!Usage.ContainsKey(key))
        {
            Usage.Add(key, 1);
            return true;
        }
        var count = Usage[key] = Usage[key] + 1;

        if (count % _reportCountInterval == 0)
        {
            WriteToFile(owner, "******** Report Count Interval ******", $"Key:'{key}'", null, count);
        }

        if (count > _maxLogLimit)
            return false;
        if (count == _maxLogLimit)
        {
            WriteToFile(owner, "******** Usage Max Exceeded ******", $"Key:'{key}' Max Limit:{_maxLogLimit}", null, count);
            return false;
        }
        return true;
    }

    private string GenerateKey(string type, string message, Exception ex)
    {
        return ex != null ?
            $"{_fileNamePrefix}_{type}_{message}_{ex.HResult}" :
            $"{_fileNamePrefix}_{type}_{message}";
    }

    private string FileName
    {
        get
        {
            DateTime date = DateTime.Now;
            string fileName = $@"\{_fileNamePrefix}.{date.Year}.{date.Month}.{date.Day}.log";

            if (!Sitecore.Configuration.ConfigReader.ConfigutationIsSet)
                return ConfigurationManager.AppSettings[Constants.Configuration.Key.LogFolderForApplications] + fileName;

            return Sitecore.MainUtil.MapPath(Sitecore.Configuration.Settings.LogFolder) + fileName;
        }
    }

    private bool IsEnabled()
    {
        if (!Sitecore.Configuration.ConfigReader.ConfigutationIsSet)
            return StringToBool(ConfigurationManager.AppSettings[Constants.Configuration.Key.MaxUsageLogEnabled], false);

        return Sitecore.Configuration.Settings.GetBoolSetting(Constants.Configuration.Key.MaxUsageLogEnabled, true);
    }

    private bool StringToBool(string value, bool defaultValue)
    {
        if (value == null)
            return defaultValue;
        bool result;
        if (!bool.TryParse(value, out result))
            return defaultValue;
        return result;
    }

    private readonly int _maxLogLimit;
    private readonly string _fileNamePrefix;
    private readonly int _reportCountInterval;

    // Counts how many times each message key has been logged, shared across
    // the application. Note: Dictionary is not thread-safe; synchronize access
    // if logging from multiple threads.
    private static readonly Dictionary<string, int> Usage = new Dictionary<string, int>();
}
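
A hypothetical usage, replacing a previously empty catch; the class and method names are illustrative. Note the message is kept constant so the max log limit applies to the whole call site:

public class ProfileSynchronizer
{
    private readonly EmptyTryCatchService _emptyTryCatchService = new EmptyTryCatchService();

    public void Push(string userId)
    {
        try
        {
            // ... call to an external system that used to be wrapped
            // in an empty try catch ...
        }
        catch (Exception ex)
        {
            // The swallowed exception is now recorded in the dedicated
            // log file, capped by the max log limit.
            _emptyTryCatchService.Log(this, "CRM profile push failed", ex);
        }
    }
}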

Hope this was of help, Alan

Reduce Technical Debt and Redundant Code

A while ago, we at Pentia took over a massive Sitecore solution where, after 15 years of upgrades and development, the maintenance cost consumed the entire digital budget of the customer.

In other words, the client was at a crossroads – to build new or renovate.

For this client the answer was relatively easy:

  • Firstly, the number of features and functionalities in the platform is vast, and just to scope and specify the entire platform was a massive, if not impossible, undertaking – and one which would claim a large number of resources internally and externally.
  • Secondly, while building a new platform (a massive task), the existing platform would have to be kept alive and slowly (painfully slowly) phased out over time. This means double resources for development, maintenance and operations.
  • Thirdly – and probably the most deterring factor – the change management involved in retraining the thousands of staff involved in and around the platform and across departments was substantial and disruptive to the entire organisation.

Therefore, a renovation project was established, and the first task was to reduce technical debt for the solution.

Reducing maintenance cost

One of the best ways to reduce technical debt is to reduce the code base: less code == less maintenance cost. In this case we managed to delete 28% of the code base. Here are a few key figures for the solution when we took it over:

  • 900+ sites (over ½ million items)
  • 15 years old (multiple upgrades from Sitecore 4.x to 8.2 and single migration)
  • 15 integrations
  • 600+ Layouts/sub layouts
  • Many JavaScript applications (Angular/React/Backbone/knockout/native/JQuery)
  • Code
    • 294,030 lines of code
    • Cyclomatic Complexity – main project 9662 average 1200
    • Depth of Inheritance – main project 17 average 8
    • Class Coupling – main project 1400, average 500
  • Single solution multiple roles
    • Content management
    • Content delivery
    • Publishing
    • Utility/API
    • Bot
  • No Access to production (apart from Sitecore client)
  • Manual deploys to Production
  • 2 separate solutions (Intranet & Websites) merged into a single solution 4 years ago
  • Not Helix compliant (sort of n-tier where projects had numbers)

The Challenge

Due to the sheer size of the solution, no one in the client’s organisation really knew which features were used and how much. There were many clear indications of code not being used or referenced.

So, the initial task was to identify and remove unnecessary parts of the solution.

But how do you identify redundant code?

Visual Studio has tools for that; unfortunately, Sitecore/web applications introduce additional challenges, as unreferenced C# code can still be executed due to the following:

  • Configuration – pipelines, event handlers, custom configuration, etc.
  • Sitecore content – items that define that specific functions on a class should be executed, e.g. WFFM.
  • Sitecore rendering engine that renders the presentation using web controls, layouts, sub layouts, controllers, code, etc.

In addition, we must identify whether the code used by the following is ever called:

  • Layouts
  • Sub Layouts
  • Controllers
  • Web Controls
  • XSLTs
  • REST APIs
  • SOAP Web Services

Solution

As in most renovation projects, there is no silver bullet; it requires a long-sighted plan, structured methodology, concepts, code, tools and continuous effort to reduce technical debt.

Ironically, to reduce the code base you must introduce more code.

Custom Attributes

We introduced several custom attributes to help mark up the code and help identify issues to be addressed.

  • Obsolete
  • Used
  • Refactor
  • Ignore Empty Try Catch (see part 2)

Used

The point of this attribute is to clearly mark that a loosely referenced class, method or interface is indeed needed by the solution.

In other words, it indicates that a class, method or interface is used, even though it has no references. It is possible to add text to explain how and where it is used.
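
A minimal sketch of what such a marker attribute could look like; the property name and the usage example are my assumptions:

using System;

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Interface | AttributeTargets.Method)]
public sealed class UsedAttribute : Attribute
{
    public UsedAttribute(string usedBy)
    {
        UsedBy = usedBy;
    }

    // Explains how/where the member is used, since no code references exist.
    public string UsedBy { get; }
}

// Example: the class is only instantiated via configuration.
[Used("Instantiated by the pipeline processor declared in Feature.News.config")]
public class NewsScheduledTask
{
}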

Obsolete

Whilst .NET provides the Obsolete custom attribute, it lacks options to indicate that the code is obsolete and can be removed when a condition is met:

  • Specific date
  • Specific release is in production
  • 3rd party system is updated to a specific version

The point of this attribute was therefore to allow us to plan the renovation project in stages and remove code when the referring parts were cleaned up.
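
A hedged sketch of what such an attribute could look like; the class name and properties are my assumptions, mirroring the three conditions above:

using System;

[AttributeUsage(AttributeTargets.All)]
public sealed class ObsoleteWhenAttribute : Attribute
{
    public ObsoleteWhenAttribute(string reason)
    {
        Reason = reason;
    }

    public string Reason { get; }

    // Remove the code once this date has passed...
    public string RemoveAfterDate { get; set; }

    // ...or once this release is in production...
    public string RemoveWhenReleaseInProduction { get; set; }

    // ...or once the 3rd party system reaches this version.
    public string RemoveWhenSystemVersion { get; set; }
}

// Hypothetical usage:
[ObsoleteWhen("Old enrollment flow", RemoveWhenReleaseInProduction = "2.4")]
public class LegacyEnrollmentStep
{
}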

Refactor

During this project we ran into many pieces of code, classes and structures which were in dire need of refactoring. But because of constraints in time, code not yet deployed, lack of knowledge, dependencies, multiple versions of a 3rd party system, or some other reason, it was not possible at that time.

Therefore, the best we could do was add this attribute and define why it should be refactored, and why it hasn’t been refactored.

The purpose of this attribute was therefore documentation and planning of the renovation process.

Introducing a “Ensure Code Is Obsolete” Service

It is very difficult to ensure that code is obsolete and never called, and that is why it is so difficult to delete code.

What we needed was a somewhat conclusive measurement of whether the running code was being executed.

We decided to introduce code in the solution that collected data on code executed across all running solution instances, aggregated the data and presented the results, to ensure that the code was not required.

The IIncrementCountService interface was introduced to provide the ability to count how often the code is executed and then send the results to the content management server, where they are aggregated with the results from the other instances.

public interface IIncrementCountService
{
  bool IncrementCount(Type type,string name);
}
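
A hypothetical usage, instrumenting a method suspected of being dead code; the controller and method names are illustrative:

public class LegacyNewsletterController
{
    private readonly IIncrementCountService _incrementCountService;

    public LegacyNewsletterController(IIncrementCountService incrementCountService)
    {
        _incrementCountService = incrementCountService;
    }

    public void Signup(string email)
    {
        // Counts every execution; if this counter never shows up in the
        // aggregated results, the method is a candidate for deletion.
        _incrementCountService.IncrementCount(GetType(), nameof(Signup));

        // ... existing (suspected obsolete) logic ...
    }
}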

Implementation Challenges

The Content Management, Content Delivery, Utility & API instances are in different network zones without access to each other.

The implementation must have a minimal impact on performance, network traffic and database storage.

It must not introduce any new databases or tables.

As we did not have access to the production environment apart from the Sitecore client, it was not possible to log the data to the file system.

Sitecore Remote Events

Remote events (see this blog for a good introduction) provide the perfect mechanism to allow all instances to send their counter data to the Content Management server, which is responsible for aggregating the data and presenting the results.

You must be careful with events: if you flood the event queue table, it can kill the performance of ALL your Sitecore instances.

The following configuration was introduced (see my blog post on Type Safe Settings) so the IncrementCount function only raises an event when one of the following is true:

  • The count exceeds 1000
  • The threshold of 15 minutes is reached
  • A new day starts

This ensures that the event queue is not overloaded and will minimize performance impact, network & database usage.

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:environment="http://www.sitecore.net/xmlconfig/environment" xmlns:role="http://www.sitecore.net/xmlconfig/role/">
	<sitecore>
		<feature>
			<Diagnostics>
				<CounterSettings type="Feature.Diagnostics.Infrastructure.CounterSettings, Feature.Diagnostics" singleInstance="true">
					<ThresholdCount>1000</ThresholdCount>
					<ThresholdTime>15</ThresholdTime>
					<Enabled>true</Enabled>
				</CounterSettings>
			</Diagnostics>
		</feature>
	</sitecore>
</configuration>

The IncrementLocalCountService class is responsible for incrementing the count, caching it locally and raising the event to notify the Content Management server when one of the aforementioned thresholds is met.

public class IncrementLocalCountService : IIncrementCountService
{
    private readonly IList<Counter> _counters = new List<Counter>();
    private readonly CounterFactory _counterFactory;
    private readonly CounterUpdateRemoteEventFactory _counterUpdateRemoteEventFactory;
    private readonly CounterSettings _counterSettings;

    public IncrementLocalCountService([NotNull]CounterFactory counterFactory,
        [NotNull]CounterUpdateRemoteEventFactory counterUpdateRemoteEventFactory,
        [NotNull]CounterSettings counterSettings)
    {
        Assert.ArgumentNotNull(counterFactory, nameof(counterFactory));
        Assert.ArgumentNotNull(counterUpdateRemoteEventFactory, nameof(counterUpdateRemoteEventFactory));
        Assert.ArgumentNotNull(counterSettings, nameof(counterSettings));
        _counterFactory = counterFactory;
        _counterUpdateRemoteEventFactory = counterUpdateRemoteEventFactory;
        _counterSettings = counterSettings;
    }

    public bool IncrementCount(Type type, string name)
    {
        if (string.IsNullOrWhiteSpace(name))
            return false;
        if (_counterSettings == null || !_counterSettings.Enabled)
            return false;

        DateTime today = DateTime.Now.Date;
        Counter counter = _counters.FirstOrDefault(c => c.Name == name && c.Date == today && c.Type == type);
        if (counter == null)
        {
            counter = _counterFactory.Create(name, today, 0);
            _counters.Add(counter);
        }
        counter.Count++;
        Flush();
        return true;
    }

    private void Flush()
    {
        // Iterate over all counters and flush those that exceed the
        // threshold count or time restriction.
        foreach (var counter in GetThresholdExceeded())
        {
            RaiseEvent(counter);
            _counters.Remove(counter);
        }
    }

    private IEnumerable<Counter> GetThresholdExceeded()
    {
        DateTime timeLimit = DateTime.Now.Subtract(new TimeSpan(0, _counterSettings.ThresholdTime, 0));
        return _counters.Where(c => c.Created < timeLimit || c.Count >= _counterSettings.ThresholdCount).ToList();
    }

    private void RaiseEvent(Counter counter)
    {
        if (counter == null)
            return;
        var counterUpdateRemoteEvent = _counterUpdateRemoteEventFactory.Create(counter.Name, counter.Date, counter.Count);
        Sitecore.Eventing.EventManager.QueueEvent(counterUpdateRemoteEvent, true, true);
    }
}
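
The Counter and CounterUpdateRemoteEvent classes (and their factories, which I assume are thin wrappers around these constructors) are not shown in the original post. Below is a hedged sketch that matches the usage above; the DataContract attributes reflect how Sitecore serializes remote events to the event queue, and everything here is an assumption:

using System;
using System.Runtime.Serialization;

public class Counter
{
    public Counter(string name, DateTime date, long count)
    {
        Name = name;
        Date = date;
        Count = count;
        Created = DateTime.Now;
    }

    public string Name { get; }
    public DateTime Date { get; }
    public long Count { get; set; }
    public DateTime Created { get; }
    public Type Type { get; set; }
}

[DataContract]
public class CounterUpdateRemoteEvent
{
    public CounterUpdateRemoteEvent(string name, DateTime date, long count)
    {
        Name = name;
        Date = date;
        Count = count;
    }

    // Matches the event name registered in the configuration below.
    public string EventName => "counter:update:remote";

    [DataMember]
    public string Name { get; private set; }

    [DataMember]
    public DateTime Date { get; private set; }

    [DataMember]
    public long Count { get; private set; }
}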

Who is responsible for aggregating the results?

The Content Management server is responsible for aggregating the results. It requires some extra configuration to subscribe to the remote event, re-raise it locally and then handle it (see this blog for more information).

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:set="http://www.sitecore.net/xmlconfig/set/" xmlns:role="http://www.sitecore.net/xmlconfig/role/">
	<sitecore role:require="Standalone OR ContentManagement">
		<events>
			<event name="counter:update:remote">
				<handler type="Feature.Diagnostics.Infrastructure.Events.Counter.CounterUpdateRemoteEventHandler, Feature.Diagnostics" method="Update" />
			</event>
		</events>
		<pipelines>
			<initialize>
				<processor type="Feature.Diagnostics.Infrastructure.Pipelines.Counter.SubscribeToCounterRemoteEventService, Feature.Diagnostics" />
			</initialize>
		</pipelines>
	</sitecore>
</configuration>

The code associated with the configuration above.

    public class CounterUpdateRemoteEventHandler
    {
        public void Update(object sender, EventArgs args)
        {
            if (args == null)
                return;

            try
            {
                var countRemoteEventArgs = args as RemoteEventArgs<CounterUpdateRemoteEvent>;
                Assert.IsNotNull(countRemoteEventArgs, $"Unexpected event args: {args.GetType().FullName}");
                Assert.IsNotNull(countRemoteEventArgs.Event, $"Event is null: {args.GetType().FullName}");

                var counterRepository = ServiceLocator.ServiceProvider.GetService<CounterRepository>();
                Assert.IsNotNull(counterRepository, $"Could not resolve type:{typeof(CounterRepository).FullName}");

                var counterFactory = ServiceLocator.ServiceProvider.GetService<CounterFactory>();
                Assert.IsNotNull(counterFactory, $"Could not resolve type:{typeof(CounterFactory).FullName}");

                var @event = countRemoteEventArgs.Event;
                var counter = counterFactory.Create(@event.Name, @event.Date, @event.Count);
                if (counter == null)
                    return;
                counterRepository.Update(counter);
            }
            catch (Exception exception)
            {
                Log.Error("CounterUpdateRemoteEventHandler.Update failed", exception, this);
            }
        }
    }

    public class SubscribeToCounterRemoteEventService
    {
        public void Process(PipelineArgs args)
        {
            Sitecore.Diagnostics.Log.Info("SubscribeToCounterRemoteEventService.Initialize Called",this);
            var action = new Action<CounterUpdateRemoteEvent>(RaiseRemoteEvent);
            EventManager.Subscribe(action);
        }

        public void RaiseRemoteEvent(CounterUpdateRemoteEvent counterUpdateRemoteEvent)
        {
            if (counterUpdateRemoteEvent == null)
                return;
            RemoteEventArgs<CounterUpdateRemoteEvent> remoteEventArgs = new RemoteEventArgs<CounterUpdateRemoteEvent>(counterUpdateRemoteEvent);
            Event.RaiseEvent(counterUpdateRemoteEvent.EventName, remoteEventArgs);
        }
    }

Where is the Data Saved?

Ideally it should be saved in its own SQL database.

Unfortunately, we were not allowed to introduce any new databases or tables, so we had to use the Sitecore IDTable. The CounterRepository is responsible for retrieving, updating and persisting the counters in the IDTable.

    public class CounterRepository
    {
        private readonly CounterFactory _counterFactory;
        private readonly GenerateKeyService _generateKeyService;

        public CounterRepository([NotNull] CounterFactory counterFactory, 
            [NotNull]GenerateKeyService generateKeyService)
        {
            Assert.ArgumentNotNull(counterFactory, nameof(counterFactory));
            Assert.ArgumentNotNull(generateKeyService, nameof(generateKeyService));
            _counterFactory = counterFactory;
            _generateKeyService = generateKeyService;
        }

        public bool Update([NotNull] Counter counter)
        {
            Assert.ArgumentNotNull(counter, nameof(counter));

            var counterInDatabase = Get(counter.Name, counter.Date);
            if (counterInDatabase == null)
                return Add(counter);
            counter.Count += counterInDatabase.Count;
            Delete(counterInDatabase);
            return Add(counter);
        }

        public IEnumerable<Counter> Get()
        {
            var idTableEntries = IDTable.GetKeys(Constants.IdTable.Prefix);
            return idTableEntries == null ? new List<Counter>() : _counterFactory.Create(idTableEntries);
        }

        private bool Add(Counter counter)
        {
            if (counter == null)
                return false;
            var idTableEntry = IDTable.Add(Constants.IdTable.Prefix,
                _generateKeyService.GenerateKey(counter.Name, counter.Date),new ID(Guid.NewGuid()),
                ID.Null,counter.Count.ToString());
            return idTableEntry != null;
        }

        private void Delete(Counter counter)
        {
            if (counter == null)
                return;

            IDTable.RemoveKey(Constants.IdTable.Prefix, _generateKeyService.GenerateKey(counter.Name, counter.Date));
        }

        private Counter Get(string name, DateTime date)
        {
            if (string.IsNullOrWhiteSpace(name))
                return null;

            var idTableEntry = IDTable.GetID(Constants.IdTable.Prefix, _generateKeyService.GenerateKey(name, date));
            if (idTableEntry == null)
                return null;
            if (!long.TryParse(idTableEntry.CustomData, out var count))
                count = 0;
            return _counterFactory.Create(name, date, count);
        }

    }

Presenting the results

No magic here: a simple counter.aspx page, which reads from the CounterRepository and displays the counters in a table, with the option to clear the database. There is also some code to ensure that only Sitecore administrators can access the page. See part 2 in the series.

Sitecore SolR Sorting Challenge

As I promised in my last post (please read it first), here is a solution to address the SolR sorting issues.

The Problem

The issue is that different pages usually have different date fields to represent how they should be sorted, and if we want to adhere to the Helix principles, the SolR feature must NOT KNOW ABOUT PAGE TYPES.

For example, a news page will have a news date, a calendar event might use the start date, and some pages will not have a date field at all and therefore have to use created and/or updated.

Typically, I see solutions that deal with this issue at retrieval time i.e. index all the different fields and then have a specific “order by” clause for each page type.

The biggest disadvantage of this approach is that you cannot sort a list with different page types, i.e. get the 10 latest items that are either news, events or articles.
In addition, you have to manage all the different order by clauses, which will destroy the indexing/SolR abstraction, as you will have to expose the IQueryable<T> in order to apply the order by clause.

Solution

I prefer to deal with the sorting issue at indexing time and have a single dedicated SolR field which is used to sort all item types. This allows you to sort news, articles, calendar events, etc. in the same way.

You still must deal with the issue that the SolR implementation should not know which field to use for a given item type. To overcome this, we use a configuration file that defines the mapping between an item of a specific type and the field to use for sorting.

Template to Field Mapping

The following configuration defines, for each item template, which field should be stored for sorting; if a field mapping is not defined, the item’s updated value is used.

In Sitecore, it is easy to map the configuration below to a C# class (i.e. SortFieldMappingRepository); for more information about how to do this, see my blog post on Structured, Type Safe Settings in Sitecore.

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:environment="http://www.sitecore.net/xmlconfig/environment" xmlns:role="http://www.sitecore.net/xmlconfig/role/">
	<sitecore>
		<feature>
			<SolRIndexing>
				<SortFieldMappingRepository type="Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting.SortFieldMappingRepository, Feature.SolRIndexing" singleInstance="true">
					<mappings hint="raw:Add">
						<!--News, NewsDate -->
						<sortFieldMapping templateId="{AE6B4DF2-DF36-4C6D-ABDA-742EE6B85DE9}" sortFieldIdId="{3D43D709-DFAE-4B4F-8CB2-DF80D9B83857}"/>
						<!-- Calendar, StartDate-->
						<sortFieldMapping templateId="{A8DD1F59-08AB-4BF0-BE76-8873A8F00628}" sortFieldIdId="{6369AC75-036B-48D8-95E2-F16998F8E777}"/>
						<!-- Video, VideoDate -->
						<sortFieldMapping templateId="{3D9D8B7A-FCB2-459B-908B-1E31F0C975FB}" sortFieldIdId="{E9993C21-1EF0-4C30-83D4-5F69923CEC3E}"/>
						<!-- Article, ModifiedDate Field -->
						<sortFieldMapping templateId="{F6B599F4-11C4-4C65-B253-95F3C40EBA18}" sortFieldIdId="{DC6C0E49-1705-4F3E-80EF-83176E482DBC}"/>
					</mappings>
				</SortFieldMappingRepository>
			</SolRIndexing>
		</feature>
	</sitecore>
</configuration>
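
The SortFieldMappingRepository itself is not shown in the original post; here is a hedged sketch of what it could look like, assuming the hint="raw:Add" binding calls an Add(XmlNode) method once per sortFieldMapping element (the attribute names must match the XML above, including the sortFieldIdId spelling):

using System.Collections.Generic;
using System.Xml;
using Sitecore.Data;

public class SortFieldMapping
{
    public ID TemplateId { get; set; }
    public ID SortFieldId { get; set; }
}

public class SortFieldMappingRepository
{
    private readonly List<SortFieldMapping> _mappings = new List<SortFieldMapping>();

    // Called by Sitecore once per <sortFieldMapping> element (hint="raw:Add").
    public void Add(XmlNode node)
    {
        if (node?.Attributes == null)
            return;
        if (!ID.TryParse(node.Attributes["templateId"]?.Value, out var templateId))
            return;
        if (!ID.TryParse(node.Attributes["sortFieldIdId"]?.Value, out var sortFieldId))
            return;
        _mappings.Add(new SortFieldMapping { TemplateId = templateId, SortFieldId = sortFieldId });
    }

    public IEnumerable<SortFieldMapping> Get()
    {
        return _mappings;
    }
}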

Define the SolR index Field

Then we define the SolR index field used for sorting and specify that the SortComputedIndexField class is responsible for adding the sort date to the index.

<sitecore>
	<contentSearch>
		<indexConfigurations>
			<defaultSolrIndexConfiguration>
				<documentOptions>
					<fields hint="raw:AddComputedIndexField">
							<!-- Sorting-->
							<field fieldName="_sort" returnType="datetime" >Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting.SortComputedIndexField, Feature.SolRIndexing</field>

					</fields>
				</documentOptions>
			</defaultSolrIndexConfiguration>
		</indexConfigurations>
	</contentSearch>
</sitecore>

The SortComputedIndexField class is responsible for providing the value for the sort field and it calls the CalculateSortDateService to determine the sort value.

namespace Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting
{
    public class SortComputedIndexField : AbstractComputedIndexField
    {
        private readonly CalculateSortDateService _calculateSortDateService;

        public SortComputedIndexField(CalculateSortDateService calculateSortDateService)
        {
            _calculateSortDateService = calculateSortDateService;
        }

        public SortComputedIndexField()
        {
            _calculateSortDateService = ServiceLocator.ServiceProvider.GetRequiredService<CalculateSortDateService>();
        }

        public override object ComputeFieldValue(IIndexable indexable)
        {
            Item item = indexable as SitecoreIndexableItem;
            if (item == null)
                return null;

            if (!item.Paths.FullPath.StartsWith(Constants.SitecoreContentRoot))
                return null;
            return _calculateSortDateService.CalculateSortDate(item);
        }
    }
}

The CalculateSortDateService class iterates over the field mappings defined in the configuration and uses the field value for the date if the field is found; otherwise the updated value for the item is used.

namespace Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting
{
    public class CalculateSortDateService
    {
        private readonly SortFieldMappingRepository _sortFieldMappingRepository;

        public CalculateSortDateService([NotNull]SortFieldMappingRepository sortFieldMappingRepository)
        {
            Assert.ArgumentNotNull(sortFieldMappingRepository, nameof(sortFieldMappingRepository));
            _sortFieldMappingRepository = sortFieldMappingRepository;
        }

 
        public DateTime CalculateSortDate([NotNull] Item item)
        {
            Assert.ArgumentNotNull(item, nameof(item));
            var mappings = _sortFieldMappingRepository.Get();
            if (mappings == null)
                return item.Statistics.Updated;

            foreach (var sortFieldMapping in mappings.Where(m => m != null))
            {
                if (item.TemplateID != sortFieldMapping.TemplateId)
                    continue;

                Field dateField = item.Fields[sortFieldMapping.SortFieldId];
                if (dateField == null || string.IsNullOrWhiteSpace(item[sortFieldMapping.SortFieldId]))
                    continue;

                return new DateField(dateField).DateTime;
            }
            return item.Statistics.Updated;
        }
    }
}

Sorting Extensions

The last part is to provide the ability to sort the result set; for this we introduce the SortDateSearchResultItem class and a few extension methods to sort ascending & descending.

namespace Feature.SolRIndexing.Infrastructure
{
    public class SortDateSearchResultItem : SearchResultItem
    {
        [IndexField("_sort")]
        [DataMember]
        public virtual DateTime SortDate { get; set; }
    }
}

namespace Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting
{
    public static class SortingQueryableExtensions
    {
        public static IQueryable<T> SortDescending<T>(this IQueryable<T> query) where T : SortDateSearchResultItem
        {
            return query.OrderByDescending(item => item.SortDate);
        }
        public static IQueryable<T> SortAscending<T>(this IQueryable<T> query) where T : SortDateSearchResultItem
        {
            return query.OrderBy(item => item.SortDate);
        }
    }
}
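
A hypothetical query using the extensions above; the index name is an assumption:

using System.Collections.Generic;
using System.Linq;
using Sitecore.ContentSearch;
using Feature.SolRIndexing.Infrastructure;
using Feature.SolRIndexing.Infrastructure.ComputedFields.Sorting;

public class LatestContentRepository
{
    public List<SortDateSearchResultItem> GetLatest(int count)
    {
        using (var context = ContentSearchManager.GetIndex("sitecore_web_index").CreateSearchContext())
        {
            // The latest items regardless of page type: news, events and
            // articles all sort on the same computed _sort field.
            return context.GetQueryable<SortDateSearchResultItem>()
                .SortDescending()
                .Take(count)
                .ToList();
        }
    }
}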

I hope this post will help, Alan


Introduce a (SolR) Sitecore Search Abstraction

After my previous post on Supporting Integrations, I received a few comments asking why SolR was in the Integrations module group, as it is part of the Sitecore API.

In this blog I will explain why, and in more detail how, to isolate a SolR integration.

Yes, Sitecore Search is part of the Sitecore API, but it relies on a 3rd party system! Please read my previous post about why you need to identify, separate and isolate modules with external dependencies, as the Sitecore Search API faces exactly the same challenges.

With the bonus that there are 3 supported implementations (Lucene, SolR and Azure Search) which are almost the same, but not quite!

Sitecore Search Issues

In most of the Helix-based solutions I have seen, indexing is implemented in the framework layer, which provides some helper extensions. Each feature then uses the indexing module with the Sitecore Search API to implement its requirements. This typically leads to the following issues:

  • Duplicated code across features
  • No clear definition of the indexing/constraints/sorting requirements for the solution.
  • Non-consistent implementation across the solution i.e. Predicate builder vs LINQ.
  • Optimization is difficult.

With each feature implementing its own indexing requirements, you get duplicated code, as each feature needs to build the query to add the Sitecore root item, base templates, language, etc. for each request, before adding the feature-specific part of the query.

Therefore, when fixing a bug or performance issue, you must track down all the places where search is used and then determine whether they require the same fix and/or optimization.

How to abstract away the SolR Search Implementation

  • Identify the indexing requirements.
    • Introduce an abstraction in the foundation layer (Indexing).
  • Create the implementation (SolR Indexing) that implements the abstraction defined by Indexing in the foundation layer.
    • Address the sorting issues (i.e. different item templates have different date fields).
  • Let the features use the indexing abstractions (i.e. Course, News, Calendar, etc.)

Identify the indexing requirements

There are 3 main components that define the indexing requirements: constraints, pagination & sorting.

Constraints

Constraints define which filters can be applied to reduce the number of items that are returned. In this example it will be possible to apply the following constraints:

  • Location in the Sitecore tree (i.e. site specific news folder, all content, etc.)
  • Language (i.e. return items with an English language version)
  • Template, i.e. does the item inherit from a specific template (i.e. news, calendar, etc.)
  • Taxonomy – return items based on their categorization (i.e. football, skiing, etc.)

Pagination

Defines the number of search results per page and which page you require.

Sorting

Is responsible for defining what is used to sort the result items and the direction (ascending or descending), for example using a date to get the 10 latest news items.

If you want to sort by date, one challenge is to determine how to sort the results, as different pages will have different fields. Some pages have no date apart from created/updated, news normally has a specific news date, and calendar events have start/end dates.

The SolR implementation must NOT KNOW ABOUT PAGE TYPES; see my blog post for a solution.

The following code defines the indexing requirements.

public interface IConstraint
{
    Item RootItem { get; }
    Language Language { get; }
    ID BaseTemplate { get; }
    IEnumerable<Category> Categories { get; }
}
public interface IPagination
{
    int Number { get; }
    int Size { get; }
}
public enum SortDirection
{
    Ascending,
    Descending
}

Then we need to define the result of making a search, and a repository to perform the search:

public interface IPagedSearchResult
{
    IEnumerable<Item> Results { get; }
    IPagination Pagination { get; }
    int TotalHits { get; }
    bool HasMoreResults { get; }
}
public interface IPagedSearchResultRepository
{
    IPagedSearchResult Get([NotNull] IConstraint searchConstraint, [NotNull] IPagination pagination, SortDirection sortDirection);
}
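
A hypothetical feature-layer consumer of the abstraction; the News service name is illustrative. The feature depends only on the foundation interfaces, never on the SolR API:

public class LatestNewsService
{
    private readonly IPagedSearchResultRepository _pagedSearchResultRepository;

    public LatestNewsService(IPagedSearchResultRepository pagedSearchResultRepository)
    {
        _pagedSearchResultRepository = pagedSearchResultRepository;
    }

    public IPagedSearchResult GetLatestNews(IConstraint newsConstraint, IPagination page)
    {
        // The SolR (or Lucene/Azure Search) implementation is injected;
        // swapping providers does not touch this feature.
        return _pagedSearchResultRepository.Get(newsConstraint, page, SortDirection.Descending);
    }
}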

The definition of the search result could have been type safe, i.e. return a model of type T instead of the Item, but I wanted to keep the example simple and not use a specific binding framework.

Anyway I hope this post will help, Alan

Sitecore Helix – Supporting Integrations

This blog will outline how it is possible to identify, separate and isolate dependencies on external systems, by introducing an Integrations module group. See here for more information about module groups in Helix.

But why do we need an Integrations module group?

Any feature that requires an integration to achieve its purpose will introduce additional challenges, relating to stability and system knowledge, compared to a standard feature layer module.

Stability

Helix is built on a number of principles that help deal with stability: the stable-dependencies principle and the stable-abstractions principle; for more details see my blog post.

Features that rely on 3rd party systems are by nature more unstable than other feature/foundation modules, as it is not usually possible to control when external systems change, upgrade or fail. Therefore, quality assurance, testing and automated deployment for the website cannot protect against this type of change or failure.

System knowledge

Working with a 3rd party system, such as CRM, ERP, Marketing Automation, DAM, SolR, a custom API, etc., requires additional system-specific knowledge. So it is a good idea to use abstractions to hide the system specific complexities, for several reasons:

  1. The web team might not have the specific system knowledge.
  2. The web team should not be distracted by the additional complexities of all the integrations.
  3. The team responsible for implementing the integration may have limited Sitecore knowledge and should not be distracted by the rest of the web solution.
  4. It is good practice to separate modules by their responsibilities by splitting the presentation and the retrieval of data from the external system.

Integrations Module Group

The intention/purpose of the Integrations module group is to clearly define which modules have a dependency on an external system and to ensure they are only responsible for the integration with that external system.

Example

This solution is responsible for selling a wide variety of courses. The customer has its own custom course catalog API and a complex legacy enrollment system.

The following diagram shows the module architecture for the solution.

In the foundation layer the following modules were introduced to provide abstractions. If you are not familiar with IoC and Abstractions see my earlier post.

  • Course Catalog
    • Defines the abstraction/system agnostic logical data module for the course catalog.
    • Acts as an abstraction between the website and Custom Catalog API.
    • It helps to focus on the ideal model that supports the business objectives.
  • Enrollment
    • Defines the abstractions to support the process of initiating attendance on a course at a specific school, and the shopping cart.
    • Acts as an abstraction between the website and the enrollment legacy system.
    • It helps to focus on the ideal model that supports the business objectives.

In the Integrations group in the feature layer, the following modules were introduced:

  • EXT Course API
    • Responsible for getting data provided by the EXT Course API.
    • Provide the implementation of the Course Catalog (foundation layer) abstractions.
    • Responsible for caching the course catalog, as the API only supports periodical batch retrieval.
  • EPOC Enrollment Management
    • Responsible for integration of the functionality provided by the EPOC Enrollment Management SDK.
    • Provide the implementation of the Enrollment (foundation layer) abstractions.

In the feature layer, the following modules were introduced:

  • Course
    • Responsible for the presentation of the course catalog, retrieved via the course catalog abstractions defined in the foundation layer.
  • Enrollment
    • Responsible for presentation and controlling the process of initiating attendance on a course and displaying the shopping cart, using the abstractions defined in the foundation layer.

Additional Bonus

Once the integration code is isolated in a single module and only responsible for the integration, it is easier using dependency injection to achieve the following:

  1. Update the external system, as the code to change is clearly defined and separated from the presentation and website logic.
  2. Provide the ability to support more than one version of an integration (i.e. different sites use different versions); a registration sketch follows this list.
  3. Move the integrations modules to an integrations platform, if it is the domain model for the customer business.
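
As a sketch of point 2, assuming a course catalog abstraction named ICourseCatalogRepository in the foundation layer and an implementation in the EXT Course API module (both names are illustrative, not from the original solution):

public class ServicesConfigurator : IServicesConfigurator
{
    public void Configure(IServiceCollection serviceCollection)
    {
        // Swap this single registration to change the external system or
        // version without touching the presentation features.
        serviceCollection.AddSingleton<ICourseCatalogRepository, ExtCourseApiCourseCatalogRepository>();
    }
}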

I hope this blog post gives you some ideas on how to isolate and remove the complexities introduced by integrations from your Sitecore solution, Alan

In my next post, I will explain why and in more detail how to isolate the SolR integration.


Structured, Type Safe Settings in Sitecore

This feature seems to be overlooked, so I hope this blog post will draw more attention to it and make its use more widespread.

In Sitecore, it is easy to map configuration settings to a C# class, whilst maintaining a structure that adheres to the Helix principles; see the config below.

The mapped C# class can then be registered with the IServiceCollection, so it can be injected into any class using dependency injection.

<?xml version="1.0"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:environment="http://www.sitecore.net/xmlconfig/environment">
	<sitecore environment:require="Local">
		<feature>
			<salesforce>
				<clientSettings type="Example.Feature.Salesforce.Infrastructure.SalesforceClientSettings, FKCC.Feature.Salesforce" singleInstance="true">
					<Username>example@blog.example.com</Username>
					<Password>xxxxxxx</Password>
					<Token>yyyyyyy</Token>
					<CacheExpiry>60</CacheExpiry>
					<OrganisationId>1111111</OrganisationId>
				</clientSettings>
			</salesforce>
		</feature>
	</sitecore>
</configuration>

Previously, settings were a long flat list which, if we were lucky, were grouped using prefixes in the name attribute to indicate which feature they were used by.

<setting name="Feature.Salesforce.Authentication.Username" value="xxxx@example.com" />
<setting name="Feature.Salesforce.Authentication.Password" value="BestPasswordInTheWorld" />
<setting name="Feature.Salesforce.Authentication.SfToken" value="Its a SF token" />
<setting name="Feature.Salesforce.Authentication.CacheExpiry" value="60" />

Solution

It is now very simple to map structured configuration to a type safe C# class, and it involves 4 simple steps.

Step 1 – Define C# Class

Define the C# class that stores the data defined by the settings in the config file. For this example, we will define some authentication settings for a Salesforce client.

namespace Example.Feature.Salesforce.Infrastructure
{
    public class SalesforceClientSettings
    {
        public string Password { get; protected set; }
        public string Username { get; protected set; }
        public string Token { get; protected set; }
        public int CacheExpiry { get; protected set; }
        public string OrganisationId { get; set; }
    }
}

Step 2 – Define the settings

It is not required, but I would recommend following the Helix principles when defining the settings structure, i.e.

[Layer]/[Feature Name]/[Settings Name]

The type attribute defines which class (i.e. the one defined in step 1) to map the settings element to.

<?xml version="1.0"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:environment="http://www.sitecore.net/xmlconfig/environment">
	<sitecore environment:require="Local">
		<feature>
			<salesforce>
				<clientSettings type="Example.Feature.Salesforce.Infrastructure.SalesforceClientSettings, FKCC.Feature.Salesforce" singleInstance="true">
					<Username>example@blog.example.com</Username>
					<Password>xxxxxxx</Password>
					<Token>zzzzzz</Token>
					<CacheExpiry>60</CacheExpiry>
					<OrganisationId>1111111</OrganisationId>
				</clientSettings>
			</salesforce>
		</feature>
	</sitecore>
</configuration>

Step 3 – Map the configuration to the C# class

Sitecore makes this easy with the Factory.CreateObject method, which loads the configuration and maps it to the C# class.

(SalesforceClientSettings) Factory.CreateObject("feature/salesforce/clientSettings", true)

Note: Factory.CreateObject expects the configuration path to be relative to the sitecore node, not the complete path.

Step 4 – Set up dependency injection

Register the created class with the IServiceCollection, so we can access it where necessary using constructor injection.

namespace Example.Feature.Salesforce.Infrastructure
{
    public class ServiceConfigurator : IServicesConfigurator
    {
        public void Configure(IServiceCollection serviceCollection)
        {
            serviceCollection.AddSingleton(provider =>
                (SalesforceClientSettings)Factory.CreateObject("feature/salesforce/clientSettings", true));
        }
    }
}
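
The configurator itself is registered with Sitecore via the services configuration element. The settings then arrive wherever needed through constructor injection; a hypothetical consumer (the SalesforceClient class is illustrative, not from the original post):

namespace Example.Feature.Salesforce
{
    public class SalesforceClient
    {
        private readonly SalesforceClientSettings _settings;

        // Resolved from the container using the singleton registered above.
        public SalesforceClient(SalesforceClientSettings settings)
        {
            _settings = settings;
        }
    }
}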

I hope this blog post helps you to organise your settings in a more maintainable and coherent structure, Alan

Sitecore SIF NewSignedCertificate – The time period is invalid

Problem – The time period is invalid. 0x80630705

The client certificate for xConnect expired on my developer machine for a solution I was developing. I thought: no problem, I will get SIF to generate new certificates for the website and xConnect.

Unfortunately, when I ran SIF, I got the following error while it was running CreateSignedCert : NewSignedCertificate.


PS>TerminatingError(New-SelfSignedCertificate): "CertEnroll::CX509Enrollment::_CreateRequest: The time period is invalid. 0x80630705 (-2140993787 PEER_E_INVALID_TIME_PERIOD)"
>> TerminatingError(New-SelfSignedCertificate): "CertEnroll::CX509Enrollment::_CreateRequest: The time period is invalid. 0x80630705 (-2140993787 PEER_E_INVALID_TIME_PERIOD)"
Install-SitecoreConfiguration : CertEnroll::CX509Enrollment::_CreateRequest: The time period is invalid. 0x80630705 (-21
40993787 PEER_E_INVALID_TIME_PERIOD)
At D:\Projects\FK.Donki\Sitecore\setup\FKCC-Install-Local-Sc-XP0.ps1:42 char:1
+ Install-SitecoreConfiguration @certParams -Verbose
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Install-SitecoreConfiguration
Install-SitecoreConfiguration : CertEnroll::CX509Enrollment::_CreateRequest: The time period is invalid. 0x80630705 (-2
140993787 PEER_E_INVALID_TIME_PERIOD)
At D:\Projects\FK.Donki\Sitecore\setup\FKCC-Install-Local-Sc-XP0.ps1:42 char:1
+ Install-SitecoreConfiguration @certParams -Verbose
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Install-SitecoreConfiguration


Firstly, I would like to say thanks to Richard Dzien from Sitecore, as he was super quick via Slack to help identify the problem.

The issue was caused by the trusted root certificates, which had expired. See the image below.

Solution

Part 1 – Delete from Trusted Roots Certificates (Computer Account)

The solution is to delete the certificates from the computer account, which you can do either via the MMC certificates snap-in or PowerShell. Then run SIF again.

Part 2 – Delete from Trusted Roots Certificates (My User Account)

There was also a copy of the root certificates in my personal certificate store, as you can see below, which also needs to be deleted.

Part 3 – Delete from disk

In addition, there can be a copy in C:\Certificates, which also needs to be deleted.

Once the certificates were deleted from all locations, everything worked. In SIF 2 the root certificates will expire in 10 years, so no problem there once it is released.

I hope this helps, Alan

Bonus help – Certificate not found, when calling xConnect

If you get an error in your Sitecore log file saying that the xConnect client certificate cannot be found!

But the certificate is in the store and has not expired!

This could be because the root certificate has expired.


The RED dot is not Enough

The classic check to ensure a test is running is that the red dot (see the image above) on the optimization tab is shown; if so, all is good. Unfortunately, that is not true: it is possible that the red dot is shown but the tests for the page are in fact not running. Sitecore are currently fixing this issue.

So how do we know if the test is running or not? You have to check the status window; for example, below the test is NOT running, as the status window says “No Tests”. Thanks to Alec Orlov from Sitecore for this tip.
If the test is running, the status window will contain the estimated number of days for the test to complete, see below.

So what can cause this issue? Well, in this case the customer had started (deployed) the test from the “Analytics Testing Workflow”, see the image below.

It worked for some Sitecore versions and/or with a solution specific patch, but it does not work in general. The workflow is an internal Sitecore workflow, which should not be used. Please follow the official Sitecore documentation to start your tests.

Solution

If you have a test that is not running, a common cause is that the test item, which is stored under /sitecore/system/Marketing Control Panel/Test Lab, is not in the correct workflow state (it must be in the Deployed state) and/or is not published.

I hope this helps, Alan

Helix and Modular Architecture

Helix is Sitecore’s code name for Modular Architecture. Helix is composed of 2 main areas:

  • Principles
  • Practical Applications (i.e. how we support/implement/conform to the principles/guidelines)

Unfortunately, many on Slack & Twitter are focusing on the practical applications and not the principles. The architectural principles are more important than how we support/implement the website, so in this blog I will give a brief introduction to the principles.

Helix/Modular Architecture is primarily based on Packaging Principles. In addition, several concepts have been introduced to help support Packaging Principles:

  1. Layers
  2. Module (referred to a package in Packaging Principles)

Now whilst not strictly part of Modular Architecture – I believe all software development should adhere to SOLID principles.

Packaging principles

Packaging is a way of grouping classes to make them more organized, manageable and maintainable! It helps us understand which classes can be packaged together, which is called package (module) cohesion, and how these packages should relate to one another, which is called package (module) coupling.

“Building software without packaging, is like trying to build a sand castle one grain at a time” – Uncle Bob

In Helix terminology, the result of packaging is called a module (not a package); therefore, for the remainder of this blog I will use module, i.e. module coupling/cohesion etc.

Module Coupling

Module coupling is the cornerstone principle in Modular Architecture and determines how modules relate to/depend on each other.

Stable-dependencies principle (SDP)

Depend in the direction of stability – a module should only rely on modules that are more stable than itself.

A stable piece of code is one where its interface does not change over time.

Features are expected to change over time and are less stable, as requirements change and or new requirements occur. Unstable code is not bad but a reality!


Stable-abstractions principle (SAP)

Abstractness should increase with stability. Modules that are maximally stable should therefore be maximally abstract; unstable modules should be concrete. The abstraction of a module should be in proportion to its stability.

Acyclic dependencies principle (ADP)

The dependencies between modules must not form cycles, i.e. no circular references. This is enforced by Visual Studio for C#, but not for JavaScript, Sitecore templates, query strings, web services, etc.

Module Cohesion

The following principles help identify what should be packaged together as a module.

Common-closure principle (CCP)

The classes in a module should be closed together against the same kinds of change. A change that affects a module affects all the classes in that module and no other module.
What changes together, should live together.

Common-reuse principle (CRP)

When you depend on one class in a module you depend on all the classes in that module, not just the one you are using.

Reuse-release equivalence principle (REP)

Essentially means that the module must be created with reusable classes — “Either all the classes inside the module are reusable, or none of them are”. The classes must also be of the same family.

Layers


Layers help by visualizing and enforcing the stable dependency and stable abstraction principles of module coupling. Each layer defines the stability of the modules and the direction of dependency.

Modules in the feature layer should not reference each other. A layer is physically described in your solution by folders in the file system, solution folders in Visual Studio, folders in Sitecore along with namespaces in code.


Foundation Layer

This layer (previously called the framework layer) is the most stable and contains only modules which are not subject to change; if they do change, it will have implications for all modules. Typical foundation modules:

  • Taxonomy
  • Dictionary
  • Indexing

Feature Layer

Modules in this layer resemble the customer domain and need to be flexible, and are therefore more likely to change. Typical feature modules:

  • Navigation
  • Search
  • Metadata

Project Layer

The project layer is the least stable layer and can reference all modules, as it is used to aggregate the functionality provided by the feature and foundation layers. Typical project modules:

  • Page Types
  • Design
  • Context (Responsible for dependency injection (IoC), of course a feature can internally use DI/IoC)

Module

The result of packaging a number of classes together is called a MODULE. Each module is represented by a single Visual Studio project.

Modules divide domain functionality into loosely coupled units with clear boundaries, where each module can contain presentation, business logic, Sitecore content (templates, layouts, setting items, etc.) and data (Sitecore, SQL, etc.).


I hope this blog post was helpful and I plan to do a series on common pitfalls which are usually related to the following 2 issues:

  1. One or more modules needs to reference another module in the same layer (which they should NOT do).
  2. How to identify a module and what should be in it.

Finding Sitecore fields in the inheritance hierarchy website from HELL!

We have all been there: we take over a solution we didn’t develop, and the complexity, hierarchy and structure of the templates is completely crazy!!!

You have a field, but trying to find the definition and or what template it belongs to is a nightmare.

The Solution – How to find any field with 1 click!

Open up DbBrowser (/sitecore/admin/dbbrowser.aspx), navigate to the item and click on the field; it takes you directly to the field’s definition.


I can’t believe I have spent so much time trying to unravel the base template field to find where a field is defined in Sitecore, when there is such a simple solution.

I hope this helps others 🙂