Yet Another Dynamics AX Blog

Supporting multiple Enterprise Portals for AX2012

With AX2012 you can set up SharePoint sites named "Enterprise Portal". These sites run perfectly well on the free SharePoint Foundation version, and they also run on SharePoint Foundation 2013 if you have the necessary updates. In this post I will discuss some considerations for supporting multiple environments, which you might need if you want to support development and testing scenarios in addition to a production environment.

Installing and configuring SharePoint is in many respects a separate skill set, so I totally get why IT Pros prefer to leave that to dedicated SharePoint consultants. Having said that, if you just need to set up Enterprise Portal for the purpose of supporting Role Centers and giving your AX2012 install a nice looking Home page, then your SharePoint install doesn't have to be too complicated.

I will use Foundation as the example, but SharePoint Server 2013 (Standard or Enterprise) obviously works with AX2012 as well. Foundation is free, but it does have the necessary features for supporting Dynamics AX Enterprise Portal with its role centers. If you plan to utilize Power BI, OData or any of the more advanced features, you should know that upgrading Foundation to Standard or Enterprise is not supported.

Assuming you're starting with a blank server, you can download SharePoint Foundation 2013 with SP1 and begin installing the prerequisites. If any of the prerequisites fails to install successfully, you can browse through the log file, look for the download URL and try installing the failed component manually. I've had to install prerequisites manually before. Eventually, you should have the necessary binaries installed and be ready to install the SharePoint binaries.

I recommend you do not run the Configuration Wizard just yet. Rather, continue with installing the updates for SharePoint. Head over to the overview of updates, download the most recent cumulative update and install it. With the latest updates installed, you are ready to initialize your SharePoint Foundation 2013 Farm.

A couple of points here:
  • Consider the account you are using to install the SharePoint farm. Typically this account is referred to as the SharePoint Setup and Farm account, and you use it again to configure the farm and potentially install more SharePoint servers into the same Farm. You may need to share the credentials of this account with other IT Pros, so avoid using your own personal account.
  • Normally you want a dedicated service account for SharePoint. This is an unattended account that has broad permissions on SharePoint. The administration web application will run under this account. 
  • It is also normal to have dedicated service accounts for several of the various services you can set up in SharePoint, but that is out of scope for this article. 
After having run the Configuration Wizard, you have the option to run the wizard that creates your first SharePoint Site. I recommend skipping this wizard and instead creating the necessary web applications manually.

Before installing the very first Enterprise Portal, I recommend the following:

  • Install AX2012 Client and Management Utilities. Point the configuration to the environment for which you want to install the first SharePoint site. Both the local configuration and the business connector configuration should point correctly and have working WCF configurations.  
  • From the SharePoint Administration Site, you need to manually create a new "Managed Account". From the Home page, under Security and General Security, you will find "Configure managed accounts", and from there you can register the business connector account as a new managed account. The SharePoint sites running Enterprise Portal need to run under this managed account. 
  • From the SharePoint Administration Site you also need to start the Claims to Windows Token Service (aka C2WTS). From Home, under System Settings and Servers, you will find "Manage Services on Server". Locate the C2WTS and start it from here (or use the PowerShell sketch below). If you start this from Services under Windows Administrative Tools (Control Panel) and not from SharePoint itself, the service will be in a faulty state and you'll get in trouble when installing Enterprise Portal. Trust me, I've been there.
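
If you prefer PowerShell over the Central Administration UI, here is a minimal sketch (run from the SharePoint Management Shell; the service type name filter is an assumption, so verify it matches on your farm):

Get-SPServiceInstance |
Where-Object {$_.TypeName -like "*Claims to Windows Token*"} |
Start-SPServiceInstance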
Now you should be ready for the first install of Enterprise Portal on this SharePoint Farm. The following steps can be repeated for each environment you want to support, be it multiple DEVs or TESTs. Obviously, there is a performance limit to this if your box can't handle the load. So let's begin:
  1. Make sure the Dynamics AX local configuration and business connector configuration point to the right AOS, and do not forget to refresh and save the WCF configuration to this config.
  2. Create a new Web Application. Each Portal needs to run on its own Web Application and isolated application pool. Give both the site and the application pool a good name. Also give the Content Database a correlating and good name. The managed account must be the one you created earlier using the business connector account. I like to put these sites on ports like 81, 82, 83, etc. (See the PowerShell sketch after this list.)
  3. When the application is created, you are ready to install Enterprise Portal using AX2012 Setup. On the step where you select a Web Application, choose the one you created in step 2. Give the site a good name, like "DynamicsAXDev" or something that makes it easy to understand what environment this site will support. Imagine looking at the URL in the browser and easily seeing from the address what environment you're currently in.
  4. Assuming the installation at step 3 went through successfully, your next step is to make sure the new site always connects to the right environment. Copy a working AX Configuration file (AXC-file) locally to the server. Make sure it has an updated and correct WCF configuration in it. I tend to put the file under c:\inetpub\wwwroot\. Give it a good name (like DEV.axc). I know the official documentation says the file can be on a UNC-share, but I never got that to work, so a local file seems to work ok. Finally, you need to put a reference to this file inside the web.config for this particular Web Application. The file is normally located under C:\inetpub\wwwroot\wss\VirtualDirectories\81 (given this application runs on port 81). Open the file in a text editor and put in a new XML section:
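
The snippet itself is not preserved here, but it is an appSettings entry pointing at the AXC file. The key name below is from memory of the Enterprise Portal documentation, so treat it as an assumption and verify it against a working install:

<appSettings>
  <add key="Microsoft.Dynamics.AX.Configuration" value="C:\inetpub\wwwroot\DEV.axc" />
</appSettings>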



    Now you can be sure the Web Application points to the right environment regardless of whatever is changed in the business connector configuration settings on this server. I put the section after the System.Web section.
  5. Finally, I recommend loading the site itself and editing the Site Permissions. You probably want to make sure either Domain Users or some dedicated AD User Group has at least Read permissions. 
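
As a hedged PowerShell sketch of step 2 (all names, the port and the account are placeholders, and depending on your authentication mode you may need an -AuthenticationProvider argument as well):

New-SPWebApplication -Name "DynamicsAXDev" -Port 81 `
    -ApplicationPool "DynamicsAXDevPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "DOMAIN\bcproxy") `
    -DatabaseName "WSS_Content_DynamicsAXDev"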
You can repeat these five steps for each environment you want to support. How cool is that? 

Now, what if you need to copy the AX2012 data between the environments? Well, that can be a problem, because when you install Enterprise Portal, setup connects to the AOS and adds data to the database. This data includes the URL and the unique ID of the site you installed. If you start copying data around, you might end up with multiple environments pointing at the same Enterprise Portal, and this Portal points to just one of the AOSes, which isn't very helpful. 

We need to fix that! :-)

Use this PowerShell command to identify the unique ID (GUID) for each site:

Get-SPSite http://fancysharepointserver:81/sites/dynamicsaxtest | 
Select -ExpandProperty AllWebs |
where {$_.Url -notmatch "dynamicsaxtest/"} | ft -a ID, Url

This will reveal the ID, which you can copy into the following SQL command:

DECLARE
    @EPURL AS VARCHAR(255),
    @EPGUID AS VARCHAR(255)

SELECT
    @EPGUID = 'e3b7b289-cb17-4c38-8e98-858181af88a5',
    @EPURL = 'http://fancysharepointserver:81/sites/dynamicsaxtest'

UPDATE EPGLOBALPARAMETERS
SET HOMEPAGESITEID = @EPGUID,
    DEVELOPMENTSITEID = @EPGUID
WHERE KEY_ = 0

UPDATE EPWEBSITEPARAMETERS
SET INTERNALURL = @EPURL,
    EXTERNALURL = @EPURL,
    SITEID = @EPGUID
WHERE COMPANYID = 'DAT'

The URL and GUID above are just examples and will obviously differ in your environment, but you get the idea. 

Now save the SQL Command and make sure to include it in your routines when copying data from one environment to another. 
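
A minimal sketch of wiring this into a copy routine with PowerShell (the server, database and file names are placeholders for your environment):

Invoke-Sqlcmd -ServerInstance "SQLSERVER\INSTANCE" -Database "MicrosoftDynamicsAX" `
    -InputFile "C:\Scripts\FixEPSiteReferences.sql"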

With all of this, you should be good to go and able to have multiple SharePoint applications running Enterprise Portals for different environments, all on the same server. 


ExchangeRateEffectiveView not returning cross rates to the Cubes in AX2012

I was asked to assist in figuring out why exchange rates wouldn't always be retrieved into the Sales Order Cube in Dynamics AX2012. The solution became a journey through various interesting topics (for me, at least):

  • Views in AX
  • View methods
  • Advanced Query range
  • Reverse engineering the SQL behind the views
Starting with the view that did not return all the expected data, we have the SalesOrderCube View. 
Now the Query in itself isn't all that fascinating, but I wasn't aware that you could add ranges across datasources, as is done in this Query. That is pretty handy!


Notice how the Query uses another View as a DataSource. There are plenty of examples of Views and Queries being nested, and this is a powerful way to create results from a rather complex ERP data model. 
Notice also that there is a custom Range on ValidFrom and ValidTo. The Ranges compare the dates from ExchangeRateEffectiveView with the CreatedDateTime from SalesTable.



If we look at the definition of the ExchangeRateEffectiveView, we see that ValidTo is a field of type Date. Furthermore, we see the field is coming from a View Method. But we know CreatedDateTime on SalesTable is a DateTime.
How can it compare a Date with a DateTime? The answer is that the actual date is stored in the database as a datetime where the time part is 00:00:00.000.
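
A quick T-SQL illustration of why this works (my example, not from the original post):

-- A Date surfaces as a datetime at midnight, so it compares cleanly with a DateTime
SELECT CASE
         WHEN CAST('2014-12-09' AS DATETIME) <= '2014-12-09T10:15:00'
         THEN 'rate found' ELSE 'no rate'
       END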


So it compares the values and that works like a charm. 

The problem

What happens if you have daily exchange rates in your system? Then your ValidFrom and ValidTo become the exact same date and, more importantly, the same time. This will not work correctly, since CreatedDateTime also keeps track of what time of day a Sales Order was created. 

So let's look at one specific example where we have rates for the 9th of December 2014.


And if we run the Sales Order Cube view, and modify the selected columns so we can see the problem, we will notice that the query is unable to collect the rates. The values from Exchange Rates are NULL.



The Solution

There are probably many ways to solve this, but the solution I went for was to make sure that ValidTo always returns the time part at the max value, which is 23:59:59.000 (AX doesn't operate on the milliseconds, as far as I know).
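
Expressed in plain T-SQL, the trick is the same expression the view method below will emit:

-- Push a date-only ValidTo to the end of that day: add a day, subtract a second
DECLARE @validTo DATETIME = '2014-12-09'
SELECT DATEADD(ss, -1, DATEADD(d, 1, @validTo)) -- returns 2014-12-09 23:59:59.000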

So compare the results coming from the ExchangeRateEffectiveView before I apply the change.


By making some small changes to the validTo View method, I can give the time part a better value.



And the result is that the Sales Order Cube now has the Cross Rates as expected. 


I hope you enjoyed this post. I sure enjoyed solving this challenge.
Here is the method body (ExchangeRateEffectiveView.validTo):

private static server str validTo(int _branchNum)
{
    str returnString, fieldString, fieldString2, fieldString3;
    boolean generateCode = false;
    DictView dictView;

    // Axdata.Skaue.04.03.2015 ->
    // Fix ValidTo with proper time 00:00:00.000 -> 23:59:59.000
    str adFixValidToString = 'DateAdd(ss,-1,DateAdd(d,1,%1))';
    // Axdata.Skaue.04.03.2015 <-

    dictView = new DictView(tableNum(ExchangeRateEffectiveView));

    switch (_branchNum)
    {
        case 1:
            returnString = dictView.computedColumnString('VarToVarBefore', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 2:
            fieldString  = dictView.computedColumnString('DenToVarEuroBefore', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToVarEuroBefore', 'FixedStartDate1', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            // Axdata.Skaue.04.03.2015 <-
            generateCode = true;
            break;

        case 3:
            fieldString  = dictView.computedColumnString('DenToDenBefore', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToDenBefore', 'FixedStartDate1', FieldNameGenerationMode::FieldList, true);
            fieldString3 = dictView.computedColumnString('DenToDenBefore', 'FixedStartDate2', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            fieldString3 = strFmt(adFixValidToString, fieldString3);
            // Axdata.Skaue.04.03.2015 <-
            returnString = 'CASE when ' + fieldString + '<= ' + fieldString2 +
                ' and ' + fieldString + '<= ' + fieldString3 + ' then ' + fieldString +
                ' when ' + fieldString2 + '<= ' + fieldString3 + ' then ' + fieldString2 + ' - 1 ' +
                ' else ' + fieldString3 + ' - 1 end';
            break;

        case 4:
            returnString = dictView.computedColumnString('SameFromTo', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 5:
            returnString = '\'21541231\'';
            break;

        case 6:
            returnString = '\'21541231\'';
            break;

        case 7:
            returnString = dictView.computedColumnString('DenToVarAfter', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 8:
            returnString = dictView.computedColumnString('DenToVarAfterRecipical', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 9:
            returnString = dictView.computedColumnString('VarToDenAfter', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 10:
            returnString = dictView.computedColumnString('VarToDenAfterRecipical', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            returnString = strFmt(adFixValidToString, returnString); // Axdata.Skaue.04.03.2015
            break;

        case 11:
            returnString = '\'21541231\'';
            break;

        case 12:
            fieldString  = dictView.computedColumnString('DenToDenAfterStart1', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToDenAfterStart1', 'StartDate2', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            // Axdata.Skaue.04.03.2015 <-
            generateCode = true;
            break;

        case 13:
            fieldString  = dictView.computedColumnString('DenToDenAfterStart1Recipical', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToDenAfterStart1Recipical', 'StartDate2', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            // Axdata.Skaue.04.03.2015 <-
            generateCode = true;
            break;

        case 14:
            fieldString  = dictView.computedColumnString('DenToDenAfterStart2', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToDenAfterStart2', 'StartDate1', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            // Axdata.Skaue.04.03.2015 <-
            generateCode = true;
            break;

        case 15:
            fieldString  = dictView.computedColumnString('DenToDenAfterStart2Recipical', 'ValidTo', FieldNameGenerationMode::FieldList, true);
            fieldString2 = dictView.computedColumnString('DenToDenAfterStart2Recipical', 'StartDate1', FieldNameGenerationMode::FieldList, true);
            // Axdata.Skaue.04.03.2015 ->
            fieldString  = strFmt(adFixValidToString, fieldString);
            fieldString2 = strFmt(adFixValidToString, fieldString2);
            // Axdata.Skaue.04.03.2015 <-
            generateCode = true;
            break;
    }

    if (generateCode)
    {
        returnString = 'CASE when ' + fieldString + '<= ' + fieldString2 + ' then ' + fieldString +
            ' else ' + fieldString2 + ' - 1 end';
    }

    return returnString;
}

Reduce SSRS deployment time for static reports in AX2012

Are you wasting minutes deploying and redeploying static SSRS reports in all the languages provided with AX2012? If you only need a handful of them, you might just as well consider disabling the licenses for the unwanted languages. You can enable them back if you need them later.

Now, disabling one language at a time manually might not be your cup of tea, so I would like to share a small job that disables all languages except the ones you want to keep enabled. Just create a new job and paste in the code below. Use at your own risk, of course; take backups and backups of the backups, etc. (you know the drill).


// Remove licence codes for unwanted languages
static void AdLanguageRemover(Args _args)
{
    SysConfig sysConfig;
    SysRemoveLicense remLic;

    Query query;
    QueryBuildDataSource qbd;
    QueryBuildRange qbr;
    QueryRun queryRun;

    FormRun confirmForm;
    Set languagesToKeep = new Set(Types::String);
    Set licenseCodeSet = new Set(Types::Integer);
    SetEnumerator it;
    int confCount = 0;
    boolean licenseChanged = false;
    Args args = new Args(formStr(SysLicenseCompareForm));
    boolean proceed = false;
    SysLicenseCodeDescription codeDescription;
    str currentLanguageId;
    int pos, sysConfigId;

    // List of languages to keep. Add, remove, change to fit your preference
    languagesToKeep.add('nb-no');
    languagesToKeep.add('en-us');
    languagesToKeep.add('sv');
    languagesToKeep.add('nl');
    languagesToKeep.add('fr');
    languagesToKeep.add('da');
    languagesToKeep.add('de');

    query = new Query();
    qbd = query.addDataSource(tableNum(sysConfig));

    qbr = qbd.addRange(fieldNum(SysConfig, ConfigType));
    qbr.value(enum2Value(ConfigType::AccessCodes));

    qbr = qbd.addRange(fieldNum(SysConfig, Id));
    qbr.value(SysLicenseCodeReadFile::rangeLanguage());

    queryRun = new QueryRun(query);

    delete_from remLic;

    while (queryRun.next())
    {
        if (queryRun.changed(tableNum(sysConfig)))
        {
            sysConfig = queryRun.get(tableNum(sysConfig));
        }

        codeDescription = SysLicenseCodeReadFile::codeDescription(sysConfig.Id);
        pos = strFind(codeDescription, '(', strLen(codeDescription), -strLen(codeDescription));
        currentLanguageId = subStr(codeDescription, pos + 1, strLen(codeDescription) - pos - 1);

        if (!languagesToKeep.in(currentLanguageId))
        {
            warning(strFmt('Removing language %1', SysLicenseCodeReadFile::codeDescription(sysConfig.Id)));
            licenseCodeSet.add(sysConfig.Id);
            remLic.clear();
            remLic.LicenseCode = sysConfig.Id;
            remLic.Description = SysLicenseCodeReadFile::codeDescription(sysConfig.Id);
            remLic.insert();
        }
        else
        {
            info(strFmt('Keeping language %1', SysLicenseCodeReadFile::codeDescription(sysConfig.Id)));
        }
    }

    if (licenseCodeSet.elements())
    {
        // if not valid code, then we should display the warning
        confCount = SysLicenseCodeReadFile::findConfigKeysFromLicenseCodeSet(licenseCodeSet);

        confirmForm = classfactory.formRunClass(args);
        confirmForm.init();
        confirmForm.run();
        confirmForm.wait();

        if (confirmForm.closedOk())
        {
            it = licenseCodeSet.getEnumerator();
            while (it.moveNext())
            {
                sysConfigId = it.current();

                update_recordSet sysConfig
                    setting value = ''
                    where sysConfig.id == sysConfigId;
            }

            SysLicenseCodeReadFile::codesModified();
        }
    }
}

Allow for a synchronization to run through after the licenses are modified. Remember that this may impact the database schema, but if you really do not want (e.g.) the Norwegian language to be enabled, it should be safe to disable. Thanks for reading!

Fixing firstDayOfWeek and firstWeekOfYear in AX2012

I was asked to have a look at why the date picker in AX still chose Sunday when the user expected Monday. I remember fixing this back in AX2009, so I was curious to see how this was solved in AX2012. The Global class and the methods firstDayOfWeek and firstWeekOfYear caught my attention. One of them used the current user's language as the key to find the best possible calendar setting, while the other picked the default system setting. Well, let us rewrite this and make it work a bit better, and since sharing is caring - here is how I solved it.

Global::firstWeekOfYear served as an inspiration, but I want it to differentiate between whatever languages my environment is serving. I can live with one cached result per language, and the performance penalty is low and acceptable.



static int firstWeekOfYear()
{
    #WinAPI
    SysGlobalCache cache = classfactory.globalCache();
    int clientFirstWeekOfYear;
    anytype calendarWeekRuleValue;
    // Axdata.Skaue ->
    /*
    str language;
    */
    str language = currentUserLanguage();
    // Axdata.Skaue <-
    System.Globalization.CultureInfo userCulture;
    System.Globalization.CalendarWeekRule calendarWeekRule;
    System.Globalization.DateTimeFormatInfo userDateTimeFormat;

    // Axdata.Skaue ->
    /*
    if (cache.isSet(classStr(Global), funcName()))
    */
    if (cache.isSet(classStr(Global), funcName() + language))
    // Axdata.Skaue <-
    {
        // Axdata.Skaue ->
        /*
        clientFirstWeekOfYear = cache.get(classStr(Global), funcName());
        */
        clientFirstWeekOfYear = cache.get(classStr(Global), funcName() + language);
        // Axdata.Skaue <-
    }
    else
    {
        // Axdata.Skaue ->
        /*
        language = currentUserLanguage();
        */
        // Axdata.Skaue <-
        userCulture = new System.Globalization.CultureInfo(language);
        userDateTimeFormat = userCulture.get_DateTimeFormat();
        calendarWeekRule = userDateTimeFormat.get_CalendarWeekRule();
        calendarWeekRuleValue = CLRInterop::getAnyTypeForObject(calendarWeekRule);

        switch (calendarWeekRuleValue)
        {
            case CLRInterop::getAnyTypeForObject(System.Globalization.CalendarWeekRule::FirstDay):
                clientFirstWeekOfYear = 0;
                break;
            case CLRInterop::getAnyTypeForObject(System.Globalization.CalendarWeekRule::FirstFullWeek):
                clientFirstWeekOfYear = 1;
                break;
            case CLRInterop::getAnyTypeForObject(System.Globalization.CalendarWeekRule::FirstFourDayWeek):
                clientFirstWeekOfYear = 2;
                break;
        }

        // Axdata.Skaue ->
        /*
        cache.set(classStr(Global), funcName(), clientFirstWeekOfYear);
        */
        cache.set(classStr(Global), funcName() + language, clientFirstWeekOfYear);
        // Axdata.Skaue <-
    }

    return clientFirstWeekOfYear;
}


Using the same ideas, I changed Global::firstDayOfWeek. Again, I allowed for one cached result for each language.

static int firstDayOfWeek()
{
    // Axdata.Skaue ->
    /*
    System.Globalization.DateTimeFormatInfo fi;
    */
    int dow;
    str language = currentUserLanguage();
    System.Globalization.CultureInfo userCulture;
    System.Globalization.DateTimeFormatInfo userDateTimeFormat;
    // Axdata.Skaue <-

    SysGlobalCache cache = classfactory.globalCache();
    int clientFirstDayOfWeek;

    // Axdata.Skaue ->
    /*
    if (cache.isSet(classStr(Global), funcName()))
    */
    if (cache.isSet(classStr(Global), funcName() + language))
    // Axdata.Skaue <-
    {
        // Axdata.Skaue ->
        /*
        clientFirstDayOfWeek = cache.get(classStr(Global), funcName());
        */
        clientFirstDayOfWeek = cache.get(classStr(Global), funcName() + language);
        // Axdata.Skaue <-
    }
    else
    {
        // Axdata.Skaue ->
        userCulture = new System.Globalization.CultureInfo(language);
        userDateTimeFormat = userCulture.get_DateTimeFormat();
        dow = userDateTimeFormat.get_FirstDayOfWeek();
        /* Removed
        fi = new System.Globalization.DateTimeFormatInfo();
        dow = fi.get_FirstDayOfWeek();
        */
        // Axdata.Skaue <-

        // The .NET API returns 0 for sunday, but we expect sunday to
        // be represented as 6, (monday is 0).
        clientFirstDayOfWeek = (dow + 6) mod 7;

        // Axdata.Skaue ->
        /*
        cache.set(classStr(Global), funcName(), clientFirstDayOfWeek);
        */
        cache.set(classStr(Global), funcName() + language, clientFirstDayOfWeek);
        // Axdata.Skaue <-
    }

    return clientFirstDayOfWeek;
}

So, for those of you who have an environment potentially supporting multiple calendar setups, I recommend applying the fix above, or writing your own fix. If you know a more efficient and better way, please comment below.

Error during install of Microsoft Report Viewer 2012

When installing the client for AX2012 R3 CU8 or later, one of the requirements is installing Report Viewer 2012. The prerequisite Validation step will inform you the component is missing, and it will provide the download link.

However, depending on your scenario you might need to download and install Microsoft System CLR Types for SQL Server 2012.

The error is "Setup is missing an installation prerequisite".


You can download the necessary components from Microsoft SQL Server 2012 Feature Pack.

Or you can download it directly using the following links:
32-bit: http://go.microsoft.com/fwlink/?LinkID=239643&clcid=0x409
64-bit: http://go.microsoft.com/fwlink/?LinkID=239644&clcid=0x409


    Compiler warnings when using AXBUILD

    When you run AXBUILD with AX2012 R3, you might notice there are some classes that throw warnings, even though you know they are not customized. Why is that?

    Here is an example running CU10, where the following classes are flagged for investigation.

    *** Bad element: \\Classes\AssetBookBonusMethod_IN
    *** Bad element: \\Classes\PCImportModel
    *** Bad element: \\Classes\VendInvoicePostBatchCleanup



    The reason these classes throw warnings is the SysObsoleteAttribute, where you can force compiler warnings with the second parameter.
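
    As a minimal, hypothetical X++ sketch of the attribute usage (the class name is made up, and the exact meaning of the second parameter should be verified against your application):

    [SysObsoleteAttribute('This class is obsolete.', false)] // second parameter controls warning vs. error behavior
    class MyOldClass
    {
    }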


    There are system classes marked as obsolete, and while you could argue they should be removed, there may be dependencies on them - out there somewhere - still.

    Login failed error while processing OLAP cubes

    I was requested to help with solving a processing error from SQL Server Analysis Services today.

    While processing, the engine fails to retrieve data from the Dynamics AX transaction database and throws the following error:
    Login failed for user "DOMAIN\SERVERNAME$". Reason: Could not find a login matching the name provided. (CLIENT: )
    (DOMAIN refers to the actual Active Directory Domain Name, and SERVERNAME$ refers to the name of the server.)



    From the error it seems like the machine account is unable to log in. Now, in my scenario the SSAS service runs on the same server as the SQL Server Engine and the instance with the Dynamics AX database. In addition, the SSAS service is NOT running under a dedicated Domain Service Account, but rather a local service account. The error then becomes sort of misleading if you read off the account name, because it is actually referring to the machine account name.

    The solution is simple, though.

    Open the SQL Server Configuration Manager and find the Analysis Services instance you are using when processing the OLAP.


    Open the properties and copy out the "Account name" which the service runs under. Normally this would be something like "NT Service\MSOLAP$INSTANCE_NAME" (INSTANCE_NAME refers to the name of the SSAS instance).


    Now open SQL Server Management Studio and open Security and Logins. Add a new Login and paste in the Account name from the previous step as the Login name.



    Before you save this new login, open "User Mappings" and find the Dynamics AX database you are trying to use as source for your OLAP. Tick the checkbox in front of the database name and grant the login "db_datareader" role membership.
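
    The same steps expressed as a hedged T-SQL sketch (the instance and database names are placeholders; on SQL Server versions older than 2012, use sp_addrolemember instead of ALTER ROLE):

    CREATE LOGIN [NT Service\MSOLAP$INSTANCE_NAME] FROM WINDOWS;
    USE [MicrosoftDynamicsAX];
    CREATE USER [NT Service\MSOLAP$INSTANCE_NAME] FOR LOGIN [NT Service\MSOLAP$INSTANCE_NAME];
    ALTER ROLE [db_datareader] ADD MEMBER [NT Service\MSOLAP$INSTANCE_NAME];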


    Now save the Login, jump back to the processing step and try running it again.

    Reporting Server Subscription does not run on Schedule

    I've set up an SSRS report that pulls data from two SQL Server databases across two different SQL Server instances. It runs under a SQL Server user and works like a charm when run manually. However, when I try to set up a Subscription for this report, the report never runs according to the schedule. I don't even get an error in the Reporting Services log.

    In order to understand what is going on, I need to go back to the SQL Server Agent and check the logs there. When you create a Report Server Subscription on a Schedule, it creates a new Job for this Schedule, and this Job initiates the Event that triggers the scheduled report(s) to run. You can read more about it here.

    First I need to identify which Job ID is behind the schedule I want to investigate. The Jobs are created using a Unique ID (GUID), and in order to link the actual report to its schedule ID I run this SQL on the Reporting Server Database (normally called "ReportServer_INSTANCENAME"):

    select s.Description, us.UserName, s.LastStatus, s.LastRunTime, c.Path, uc.UserName, rs.ScheduleId
    from ReportServer.dbo.Subscriptions s
    join ReportServer.dbo.Catalog c on c.ItemID = s.Report_OID
    join ReportServer.dbo.ReportSchedule rs on rs.SubscriptionID = s.SubscriptionID
    join ReportServer.dbo.Users uc on uc.UserID = c.ModifiedByID
    join ReportServer.dbo.Users us on us.UserID = s.OwnerId

    Going back to the Job, I can see from the Job History that it doesn't even get to the first step of the Job. Since this is AX, the Job is run under the same account as the Business Proxy Account; that is how SSRS is normally configured in relation to AX. The error from the history log actually says "the owner of job does not have server access". The job is actually set to run in the context of a user that does have server access, but somehow (a bug, maybe) this is ignored by the SQL Server Agent.

    The solution is:
    1. Change the owner of the job to the SQL Server User that has the necessary permissions to run the report (see the sketch below)
    2. Grant the SQL Server User permissions to operate on the Report Server Database (it does need permissions to inject data; however, I gave it full db_owner permissions)
    3. Test the job by doing a manual run
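
    A hedged T-SQL sketch of step 1 (the job name and the login are placeholders; the Agent job is named after the ScheduleId GUID from the query above):

    EXEC msdb.dbo.sp_update_job
        @job_name = 'SCHEDULE-GUID-FROM-THE-QUERY-ABOVE',
        @owner_login_name = 'ReportRunnerSqlUser';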
    If the job still fails, you can investigate the Job history for any errors. 

    Stored Procedure for listing who is running AX queries on the SQL Server

    I want to share a small nugget I've created for listing who is currently running queries against AX at any point in time. It builds on two prerequisites: one of my suggested answers on the Dynamics AX Community Forum, and a solution provided by Microsoft on their own blog.

    For the sake of completeness I will include the "sp_lock3" stored procedure in this post, but it is only fair to mention that I got it from the LearnSQLForum, posted by Scott Wigham here:
    http://forums.learnsqlserver.com/SqlServerTopic40.aspx

    The sp_lock3 looks like this:


    /*******************************************************
    Source:
    http://forums.learnsqlserver.com/SqlServerTopic40.aspx
    ******************************************************/

    USE master
    GO
    IF ( SELECT OBJECTPROPERTY(OBJECT_ID('sp_lock3'), 'IsProcedure')) = 1
    DROP PROC dbo.sp_lock3
    GO
    CREATE PROC dbo.sp_lock3 (
    @spid1 INT = NULL /* Check only this spid; if this is NULL then all spids will be checked */
    , @spid2 INT = NULL /* and this spid; if this is not null, @spid1 must be not null as well */
    )
    AS

    CREATE TABLE #locktable (
    spid SMALLINT
    , loginname NVARCHAR(128)
    , hostname NVARCHAR(128)
    , dbid INT
    , dbname NVARCHAR(128)
    , objId INT
    , ObjName NVARCHAR(128)
    , IndId INT
    , Type NVARCHAR(4)
    , Resource NVARCHAR(16)
    , Mode NVARCHAR(8)
    , Status NVARCHAR(5)
    )

    SET NOCOUNT ON

    IF @spid2 IS NOT NULL AND @spid1 IS NULL
    SET @spid1 = @spid2

    DECLARE @object_id INT,
    @dbid INT,
    @DynamicSql NVARCHAR(255)

    /***** @spid1 is provided so show only the locks for @spid1 and @spid2 *****/
    IF @spid1 IS NOT NULL
    INSERT #locktable ( spid, loginname, hostname, dbid, dbname, objId, ObjName, IndId, Type, Resource, Mode, Status )
    SELECT CONVERT (SMALLINT, l.req_spid)
    , COALESCE(SUBSTRING (s.loginame, 1, 128), '')
    , COALESCE(SUBSTRING (s.hostname, 1, 128), '')
    , l.rsc_dbid
    , SUBSTRING (DB_NAME(l.rsc_dbid), 1, 128)
    , l.rsc_objid
    , ''
    , l.rsc_indid
    , SUBSTRING (v.name, 1, 4)
    , SUBSTRING (l.rsc_text, 1, 16)
    , SUBSTRING (u.name, 1, 8)
    , SUBSTRING (x.name, 1, 5)
    FROM master.dbo.syslockinfo l JOIN master.dbo.spt_values v
    ON l.rsc_type = v.number
    JOIN master.dbo.spt_values x
    ON l.req_status = x.number
    JOIN master.dbo.spt_values u
    ON l.req_mode + 1 = u.number
    JOIN master.dbo.sysprocesses s
    ON l.req_spid = s.spid
    WHERE v.type = 'LR' AND x.type = 'LS' AND u.type = 'L' AND l.req_spid in (@spid1, @spid2) and l.rsc_dbid not in (32767)

    ELSE /***** @spid1 is not provided so show all the locks *****/
    INSERT #locktable ( spid, loginname, hostname, dbid, dbname, objId, ObjName, IndId, Type, Resource, Mode, Status )
    SELECT CONVERT (SMALLINT, l.req_spid)
    , COALESCE(SUBSTRING (s.loginame, 1, 128), '')
    , COALESCE(SUBSTRING (s.hostname, 1, 128), '')
    , l.rsc_dbid
    , SUBSTRING (DB_NAME(l.rsc_dbid), 1, 128)
    , l.rsc_objid
    , ''
    , l.rsc_indid
    , SUBSTRING (v.name, 1, 4)
    , SUBSTRING (l.rsc_text, 1, 16)
    , SUBSTRING (u.name, 1, 8)
    , SUBSTRING (x.name, 1, 5)
    FROM master.dbo.syslockinfo l JOIN master.dbo.spt_values v
    ON l.rsc_type = v.number
    JOIN master.dbo.spt_values x
    ON l.req_status = x.number
    JOIN master.dbo.spt_values u
    ON l.req_mode + 1 = u.number
    JOIN master.dbo.sysprocesses s
    ON l.req_spid = s.spid
    WHERE v.type = 'LR' AND x.type = 'LS' AND u.type = 'L' and l.rsc_dbid not in (32767)

    /**********************************************************************************************
    Because the locks exist in any database, you must USE before running OBJECT_NAME

    We use a dynamic SQL loop to loop through each row from #locktable

    A temp table is required here since SQL Server 2000 cannot access a table variable when issuing dynamic sql
    **********************************************************************************************/
    -- Initialize the loop
    SELECT TOP 1 @dbid = dbid, @object_id = ObjId FROM #locktable WHERE Type ='TAB' AND ObjName = ''

    WHILE @dbid IS NOT NULL
    BEGIN
    SELECT @DynamicSql =
    'USE ' + DB_NAME(@dbid) + char(13)
    + 'UPDATE #locktable SET ObjName = OBJECT_NAME('
    + CONVERT(VARCHAR, @object_id) + ') WHERE dbid = ' + CONVERT(VARCHAR, @dbId)
    + ' AND objid = ' + CONVERT(VARCHAR, @object_id)

    EXEC sp_executesql @DynamicSql

    SET @dbid = NULL -- TSQL preserves the "old" value unless you initialize it to NULL
    SELECT @dbid = dbid, @object_id = ObjId FROM #locktable WHERE Type ='TAB' AND ObjName = ''
    END

    SELECT * FROM #locktable
    WHERE objname NOT LIKE '#locktable_____%' -- don't return this temp table
    AND objid > 100 -- do not return system table locks
    AND objname <>'spt_values'
    GO

    Once you install this procedure, simply by running the script above on the SQL Server instance, it will be globally available on that instance.

    The next step is to run the script below to install my "sp_whoInAx":


    /************************************************************************************
    sp_whoInAx This script lists out current users running queries against
    any Dynamics AX database on this SQL Server Engine instance.
    Please report any issues and improvements back to the author.

    Written By: Tommy Skaue (email: add the @-sign between first and last name, and end with .com)
    Microsoft Dynamics AX MVP
    yetanotherdynamicsaxblog.blogspot.com

    Version: 1.0

    Comments: This procedure requires sp_lock3 in order to work.

    This script is presented "AS IS" and has no warranties expressed or implied!!!
    **********************************************************************************/

    USE [master]
    GO
    IF ( SELECT OBJECTPROPERTY(OBJECT_ID('sp_whoInAx'), 'IsProcedure')) = 1
    DROP PROC [dbo].[sp_whoInAx]
    GO

    CREATE PROC [dbo].[sp_whoInAx]
    AS

    CREATE TABLE #LOCKTABLE (
    SPID SMALLINT
    , LOGINNAME NVARCHAR(128)
    , HOSTNAME NVARCHAR(128)
    , DBID INT
    , DBNAME NVARCHAR(128)
    , OBJID INT
    , OBJNAME NVARCHAR(128)
    , INDID INT
    , TYPE NVARCHAR(4)
    , RESOURCE NVARCHAR(16)
    , MODE NVARCHAR(8)
    , STATUS NVARCHAR(5)
    )

    INSERT INTO #locktable
    EXEC sp_lock3

    CREATE TABLE #CurrentAxSessions (
    CI VARCHAR(128)
    , HOST_NAME VARCHAR(128)
    , SESSION_ID smallINT
    , DATABASE_ID smallINT
    , LOGIN_TIME datetime
    , STATUS VARCHAR(30)
    )

    INSERT INTO #CurrentAxSessions
    SELECT
    CAST(CONTEXT_INFO AS VARCHAR(128)) AS CI
    , HOST_NAME
    , SESSION_ID
    , DATABASE_ID
    , LOGIN_TIME
    , STATUS
    --, STATUS, CPU_TIME,MEMORY_USAGE, TOTAL_SCHEDULED_TIME, TOTAL_ELAPSED_TIME
    --, LAST_REQUEST_START_TIME, LAST_REQUEST_END_TIME, READS, WRITES, LOGICAL_READS, OPEN_TRANSACTION_COUNT
    FROM SYS.DM_EXEC_SESSIONS
    WHERE 1=1
    AND PROGRAM_NAME LIKE '%DYNAMICS%'
    AND CAST(CONTEXT_INFO AS VARCHAR(128)) <>''

    SELECT
    AX.CI
    ,AX.LOGIN_TIME
    ,AX.SESSION_ID
    ,AX.STATUS
    ,LT.HOSTNAME
    ,LT.DBNAME
    ,LT.OBJNAME
    FROM #CurrentAxSessions AX
    INNER JOIN #locktable LT ON LT.DBID = AX.DATABASE_ID AND LT.SPID = AX.SESSION_ID

    GO

    It takes the result of sp_lock3 and combines it with the "AX sessions". Again, it does require the registry change mentioned in Microsoft's blog, in addition to a restart of the AOS service.

    To reiterate the steps for including the user in the session context, they are (a PowerShell equivalent follows the list):

    1. Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Dynamics Server\6.0\01\Original (installed configuration). The last key, Original (installed configuration), is the key name for the current server configuration. If your system uses a different configuration than the original installed configuration, navigate to the currently active configuration.
    2. Create a string registry value called 'connectioncontext' and set the value to 1.
    3. Restart the AOS.
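
    A minimal PowerShell sketch of the same steps (the instance key "01" and the AOS service name are assumptions; adjust them to your installation):

    New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\services\Dynamics Server\6.0\01\Original (installed configuration)" `
        -Name "connectioncontext" -Value "1" -PropertyType String
    Restart-Service 'AOS60$01'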
    Now you can run sp_whoInAx from any query window on that SQL Server instance, and see from the context the user id and also see what database and tables are involved. 

    Kicking off a dedicated podcast with my friend, colleague and fellow MVP Fredrik

    It is with great excitement I am happy to announce the Dynamics AX Podcast!

    It is a joint effort with MVP Fredrik Sætre to create another channel where all you Dynamics AX geeks, both techies and funkies (yea, functional consultants) can tap into thoughts, ideas, tips, insights, fun AX stories, and the list goes on and on.

    Bear in mind, we invite anyone who is willing to share their insights, and while we don't necessarily have to agree on everything, the primary aim is to share thoughts and have fun!

    I really hope you enjoy it, and please feel free to comment, either here on this blog, or on the Youtube channel videos. Feel free to Tweet questions and topics you'd like to be discussed.

    The first Podcast is available here:

    Using PowerShell to list the 10 most recent KBs installed in AX2012 R3

    I was asked to get an overview of the most recent KBs installed on an AX2012 R3 environment.
    One way to do this is using PowerShell.
    We know each KB is deployed as a single model, and we know each model installed has a unique ID. This ID is incremented automatically by the system, so why not just sort on it in a descending fashion.

    Look at the following PowerShell command:

    axmodel | 
    where {$_.Name -match 'KB'} |
    sort modelid -Descending |
    select modelid, displayname, version, description, details -first 10 |
    out-gridview

    I assume the default environment picked up by the command "axmodel" (an alias for Get-AXModel) is the one we want to query. You can put in "-Verbose" after the command if you'd like to see some verbose information about which ModelStore database the command operates on.

    I then add a filter on the list, making sure I only get models with a match on "KB" in the Name property. I could also have looked for the word "hotfix" in the Description property. The match is not case sensitive.

    I then pipe the result to a sorting where I want the list sorted descending on the ModelId.

    I select only some of the columns, and I also pick out just the first 10 rows of the result.

    Finally, I send the result to the GridView window, just because I like to see the result in a nice window (which allows for resizing, column resizing, filtering, etc.).

    Notice also that I can have this entire command on multiple lines; the trick is that PowerShell allows this when a line ends with a pipe (|), like above. Otherwise, a line break is interpreted as execution of a command, so be aware of that when running a "multi-line" PowerShell command.

    You can also use the example above to do other things, like looking for specific KBs and check if they are already installed.
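
    For instance, a hedged variation that checks whether a specific KB is already installed (the KB number is just an example):

    if (axmodel | where {$_.Name -match 'KB3216898'}) { 'Installed' } else { 'Not installed' }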

    Enjoy!

    Using the On-Premise Gateway to connect your AX2012 data to the Power BI Portal

    PowerBI has been around for a long time by now, so there is plenty of information out there on how to connect your data sources to the powerful PowerBI Portal (www.powerbi.com). Getting all the moving parts to connect properly might have been difficult at times, but I'm writing this post to reassure you that it is currently very easy to set up.

    Before I begin, I just want to add a precaution:
    Consider the implications around security and performance when setting this up.

    I prefer to use a common service (or setup) account for this, and not my own consultant login. This makes it a little easier if someone else needs to step in and maintain the setup. Furthermore, it allows the customer to lock down the credentials after I've completed the setup.
    As for performance, you should pay attention to how data refresh adds load to your servers, both the one hosting the gateway itself and the one hosting the data source (SQL Server and/or Analysis Services). You don't want to cause a full system overload while pulling data from your sources.

    I will use the standard Dynamics AX SSAS OLAP as an example, but the point here is less the data source, and more how easy it is to connect to the PowerBI Portal.

    Before we begin, I want to list some prerequisites, or at least how I would set it up:

    • You are using a dedicated setup account, and this account is a domain user
    • You are local admin on the server where you plan to set up the gateway. Basically, your setup account is listed in the Administrators group (under Computer Management, Local Users and Groups, Groups, Administrators).
    • You have access to the SQL Server Analysis Services (SSAS) with your setup account. Check by right-clicking the SSAS instance, choosing Properties and looking at the list of users under Security.
    • You have a user who is Global Admin in Azure AD. This could be the setup user, synced to Azure AD from the On-Premise domain, but that's not necessary. The point is this user will have access to set up things on PowerBI, which currently requires Office 365 Global Admin rights. This may change in the near future, hopefully.
    Given all of the above, you'll simply start by logging on to the PowerBI portal using the Office 365 Global Admin user and downloading what's called the "Data Gateway". The download link is at the top and takes you to the download page. Press Download and get the latest and finest version.




    When you run this installer, it will ask you to log in using the Office 365 Global Admin user (which will have access to register the gateway). Also, I am using the "Enterprise Gateway" option when installing. This allows me to schedule refreshes from data sources based on SSAS.
    The gateway has its own set of prerequisite software, so have a look at those before you begin.

    When the gateway is installed successfully, it can be utilized to connect to ANY of the SSAS instances on the domain, given that the network traffic is allowed and you connect with a user who has access to the SSAS instance. So your LIVE, TEST, DEV, and so on. How cool is that?

    Next you would use the PowerBI Admin Portal to configure the Gateway and add your data sources.
    Head over to Manage gateways and click "Add Data Source".



    Fill in the form. Notice I am using the name of the server where SSAS is running, the name of the SSAS instance, and the domain user who has access to the SSAS Server itself. Finally, I put in the name of the OLAP, Dynamics AX Initial.



    The data source should connect and confirm everything looks good for you to connect to the data source and whatever it contains. Great!
    A lot of people get here fine, but the next part is something which was added fairly recently - well, actually some months ago, in the April 2016 update.

    Why is this update important?

    Given the scenario where you're trying to connect some on-premise SSAS with PowerBI in the cloud, who's to say you're fully synchronizing on-premise Active Directory with Azure Active Directory? What if your local domain doesn't map the users perfectly with the usernames in Azure AD? This is where "Map User Names" comes into play. We can actually add string replace rules to the usernames, so if your users are not perfectly mapped between Azure AD and the On-Premise domain, you can still get this to work.

    So in this example, I will assume the On-Premise domain is using a different domain name than the one used by Office 365 and Azure AD. On-Premise, I imagine CONTOSO is actually fully qualified as contoso.eu.local, while in the cloud users are using contoso.com.

    Click the Data Source that needs to be mapped. Right now, these settings are not shared across data sources, but hopefully further administrative improvements will be added here.
    Open the list of Users and look at the bottom for the Map User Names button.



    This will slide in the setup for mapping of user names.



    Notice in my example I am replacing the long username powerbiadmin@contoso.com with service-account-with-access-to-ssas@contoso.eu.local. So anytime I am logged in at the PowerBI portal with this powerbiadmin user, and I try to access the data sources through the gateway, the user principal names will be "washed" through the mapping, and "magically" the credentials for that user will work On-Premise, because the local domain sees a user it recognizes. Furthermore, I added another example of a user who locally is represented by u12345@contoso.eu.local, while in Azure AD he is actually tommy@contoso.com. So if this user also tries to update or refresh data sources, the credentials will work locally.

    What next?

    Well, you can click "Get Data", select "Database" and choose "SQL Server Analysis Services", then simply pick your preferred cube from one of your data sources and click "Connect". With the new dataset in place, you can schedule a refresh outside regular business hours. Like this:





    A couple of follow-up questions:

    Q) What happens if I map two completely different users, who actually both exist in Azure and On-Premise?
    A) You're the admin, and while there are no features to prevent potentially illogical mappings, you can map yourself into complete chaos - to your own or someone else's despair.

    Q) Do I need to map all users like this? 
    A) Since the mapping is a simple string replace, you can replace similar parts of the username, like replacing "@contoso.com" with "@contoso.eu.local". If you're lucky, this will be enough to fix most usernames. Also consider that there may be a number of users who will only load the Reports, but who do not need access to actually reload the datasets with fresh data from the data sources. Surely, those users do not need to be mapped.

    Q) How much time does it take to set this up?
    A) With some practice, and if the users are set up with permissions as described in the beginning of this post, I bet you can get this up, connected and working within the hour. The rest is waiting for data to come through so you can start filling your beautiful reports and dashboards with powerful data.

    Q) What if it fails horribly and I get stuck? :'-(
    A) Use the community forum and make sure to tag your question with "BI".

    Managing your Azure Subscriptions created through CSP portal

    Let me start off with a disclaimer, as Microsoft may change the behavior, which would render this post obsolete. In that case I'll try to come back and make the necessary amendments. 

    If you have worked with managing your Azure resources through PowerShell, you will notice that Azure Subscriptions created through the Cloud Solution Partner (CSP) portal behave slightly differently. This post from August 2016 goes into detail on how to migrate from "traditional" Azure Subscriptions to "CSP" Subscriptions.

    In my post, I want to just quickly show you some key points.

    Azure Portal navigation

    One thing you will quickly notice is that if you access the CSP portal and open the Azure Portal from there, all of the classic resource types in Azure are completely hidden. You can only create and operate on Azure Resource Manager (ARM) types of resources. So basically, this prevents you from using the Azure Service Management API and any interface that assumes ASM, or "Classic Azure" as it is also called.

    Another thing you'll notice is that if you navigate to the Azure Portal directly (portal.azure.com) you do not necessarily see the Azure Subscriptions from your tenants in the list of Directories. I say "necessarily" because if your user has been explicitly granted the "owner" role on the tenant, that is a different story. One of the core features of the CSP program is that the partner already is "owner" through the Foreign Principal role - more specifically, all users who have the AdminRights permissions within the CSP portal. You can read more about that here.

    So in order to navigate to the customer's Azure resources, you need to explicitly go to the tenant through the URL. That will open the tenant's context and off you go. The URL will typically be something like this: https://portal.azure.com/TENANTNAME.onmicrosoft.com (or the customer's own domain, if it is fully set up).

    Azure PowerShell management

    What about PowerShell? Is that any different? YES!

    If you run Login-AzureRmAccount without setting a context, you'll end up only seeing Azure Subscriptions you have access to explicitly. And even then, Azure Subscriptions created through CSP will behave differently.

    The solution is rather easy, even if you could argue it's a bit cumbersome.
    You need to explicitly set the context.

    Here are some options available:

    • You either explicitly log in to the tenant and subscription:
      Login-AzureRmAccount -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
    • Or log in "normally" and then select the tenant and subscription:
      Select-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
    • Or you could log in and set the context using the following command:
      Get-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID | Set-AzureRmContext

     If you do not set the context explicitly, you will not be able to operate on the Azure resources.

    Now, some readers may have noticed that Azure Subscriptions created through CSP are inaccessible in the old Classic Azure Portal, which in turn disconnects such a Subscription from being available in Life Cycle Services (LCS). LCS does support ARM by now, so I believe the solution should be just around the corner. We're just missing one minor piece for all of this to work together properly.

    Have a nice Christmas holiday, everyone!

    Error when installing Reporting Extensions for AX2012 R3 on SQL 2014 SSRS

    Hi

    I could not really find any posts on this out there, so I decided to just share this here.

    You may experience installation errors when you try to install Reporting Extensions for AX2012 R3 on SQL 2014. The setup crashes internally, rolls back the installation and fails.
    In the installation log you will see the error "Version string portion was too short or too long".

    The solution is available on LCS as a downloadable hotfix KB3216898 (released 10th of January 2017) here:
    https://fix.lcs.dynamics.com/Issue/Resolved?kb=3216898&bugId=3800976

    Unpack the content of the hotfix, slipstream it as part of your AX2012 R3 installation and run the installation again. Now it will work.

    Just to make sure people find this post if they search for the errors, I'll add the full call stack below:

    An error occurred during setup of Reporting Services extensions.
    Reason: Version string portion was too short or too long.
    System.ArgumentException: Version string portion was too short or too long.
      at System.Version..ctor(String version)
      at Microsoft.Dynamics.AX.Framework.Reporting.Shared.SrsWmi.get_ReportManagerUrl()
      at Microsoft.Dynamics.Setup.ReportsServerInstaller.GetOrCreateServerConfiguration(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)
      at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.InstallConfigurationFiles(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)
      at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.RunReportingSetupManagerDeploy()







    Error when installing Reporting Extensions for AX2012 R3 on SQL Server 2016

    In my previous post I wrote about installing the Reporting Extensions for AX2012 R3 on SQL Server 2014. In this post I want to emphasize that the same hotfix needed for SQL 2014 is needed for SQL 2016.
    The error behaves slightly differently on SQL 2016 if you do not have the patch. The setup experience simply crashes during install, and while the component is ticked as "installed" the next time you run setup, it is only "half-baked". You need to start over, this time with the hotfix ready.

    Here is a screenshot of the installer crash with "AxSetup.exe has stopped working". Ignore that it is on the SSAS step; I simply chose to install both extensions at the same time. The error actually relates to the Reporting Extensions.



    And if you open the setup logs for further inspection, you will see it ends while trying to setup the SSRS bits. Here is an excerpt from the install log:

    2017-07-05 11:30:56Z Setting the SQL Server Reporting Services service account to the Microsoft Dynamics AX .Net Business Connector account.
    2017-07-05 11:30:56Z Generating database rights script.
    2017-07-05 11:30:56Z Opening connection to the database using the connection string server=SERVER\INSTANCE;Integrated Security=SSPI.
    2017-07-05 11:30:56Z Writing the database rights script to C:\Users\USERID\AppData\Local\Temp\3\tmpADC0.tmp.
    2017-07-05 11:30:56Z Executing database rights script.


    I got this error even though the installation was slipstreamed with CU12, which is a later version than the hotfix.

    So if you're planning on installing these bits for SQL 2016 (or SQL 2014), do yourself the favor of downloading KB3216898 and slipstreaming your install by extracting it into your installation's Update folder.

    Here is the link, again: https://fix.lcs.dynamics.com/Issue/Resolved?kb=3216898&bugId=3800976

    Excel Applet not loading when working with D365 Office Add-in

    This post is somewhat related to the post by Ievgen (AX MVP) on the Office Word integration not working with Dynamics 365 for Finance and Operations (Enterprise Edition).

    If you try to connect Excel to an existing environment using the Microsoft Dynamics Office Add-in and all you see is the text "Load applets" after signing in, then it might very well be because the applets need to be initialized from within the environment.

    If you click the little flag at the bottom right, you will be able to open the messages and see the error "No applet registrations found".



    The solution is simple. Open the D365 environment in your favorite browser (assuming your favorite browser is on the compatible list - hehe) and either search for the form (i.e. type in "Office") or navigate directly through System Administration, Setup and Office app parameters.


    If you see an empty grid, then the settings have not been initialized, and that is the problem. Most likely you are missing settings for Office apps in general, so go ahead and initialize all parameters for all the grids accordingly.


    Head back to Excel and try to make it reload the applets (simply try adding a trailing slash to the URL). You should hopefully get the expected result.

    Consider changing your password Pop Up

    Currently the machines deployed through LCS run with an Account Policy where passwords have a maximum age of 42 days. Interestingly, according to this statement of guidelines, you should not change the password for servers deployed this way.

    So if you get annoyed by the reminder to change the password and do not plan to republish the box any time soon, why not go ahead and get rid of the pop-up?


    Click the start button and type in "local". You should find the Local Security Policy Console.



    From there it is just a matter of changing the password expiration to something other than 42, or simply setting it to 0 for "never expire".



    Quick and easy.

    Alternatively you can use a Command Prompt (Run as Admin) with the statement:

    net accounts /maxpwage:unlimited
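    To verify the change took effect, run the command again without arguments and check the "Maximum password age" line in the output:

    net accounts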

    Initial steps to troubleshoot failed environment servicing

    On the topic of patching and updating an existing D365 Operations environment, I will refer to the online documentation.
    There are also some great community posts that aim to help you, and you may want to check them out.
    I expect more posts to show up. As of writing this, installing updates can be a bit tedious and cumbersome.

    I will use this post to share a recent update that failed on a Platform Update. A Platform Update is expected to be a fairly straightforward and safe operation. You simply import the update to your assets in LCS and apply it to your environment (assuming you're running your environment in the cloud). I will not discuss On-Premise in this post.

    I had an environment running application 1611 with Platform Update 7, and I was trying to install Platform Update 10. After it failed on several attempts, I started investigating why it failed.



    Here are the steps I took.

    1) Identify which step failed. In my case it was step 13. (Not exactly my lucky number)



    2) Find the runbook output (normally under C:\RunbookOutput) and locate the PowerShell script that fails. I simply searched the log for "13" (see the sketch after these steps).

    3) Open PowerShell ISE in Admin mode and open the PowerShell script. You will find the script in the J:\DeployablePackages folder, and you can match the GUID from the log with the runbook folder. The scripts are located in a standardized folder path.


    4) Run the script and let it fail. From there you can add breakpoints, run it again and step through to see why it failed. Use whatever you find as information when you contact Microsoft Support. Some updates fail when they should not, and it is important that anyone with a support agreement reports findings back to Microsoft.
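    If you want to speed up steps 2 and 3, here is a rough PowerShell sketch using the default folder locations mentioned above (the actual log file names and the package GUID will differ per update):

    # Find mentions of the failing step in the runbook output
    Get-ChildItem -Path 'C:\RunbookOutput' -Recurse -File | Select-String -Pattern 'step 13' -SimpleMatch
    # List the update scripts under the deployable packages folder
    Get-ChildItem -Path 'J:\DeployablePackages' -Recurse -Filter '*.ps1' | Select-Object FullName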

    Now, in my particular case, the script did not fail when I ran it manually. It succeeded. I can only guess as to why that is the case, but after going back to LCS and letting the update "Resume", it eventually finished all the upgrade steps successfully.

    In any case, the initial steps above can help you push through a failing update and potentially lead you to the answer to why an update unexpectedly failed.

    Importing users in D365 Operations using Excel

    Let me start off by admitting I was initially thinking about naming this post "Importing users in AX7 using Excel", so there, now this post suddenly became a little bit more "searcher friendly".

    In this post I will show how easily you can connect to your Dynamics 365 Operations instance using Excel. Before I begin, let me just remind you that importing users from Azure Active Directory is perhaps easier and quicker. So this post is just to show that it is also possible to import users using Excel with the Dynamics Office Add-in.

    You may have seen the Data Entity "System User" (SystemUserEntity), you may have tried using it to add users, and furthermore you may also have seen the error "A row created in data set SystemUser was not published. Error message: Write failed for table row of type 'SystemUserEntity'. Infolog: Error: Error in getting SID."


    You will get the same error through Excel if you do not provide some additional columns and information while trying to create a user through that Data Entity.

    You can either start off by opening Excel, installing the Dynamics Office Add-in and connecting it to the target instance, or you can open the list of users directly on the instance and open the list in Excel from there. Either way, you should start with a view where you have the System User list in your spreadsheet.

    The next step is to modify the Design of the view. Click the Design link first.



    Then edit the System User table.



    Then add the following two columns: Type (AccountType) and Alias (Email).



    Save the design changes and ensure you update the view so the added columns are populated with data.

    You will notice the Type (AccountType) and Alias (Email) columns carry important information about how the user authenticates, in addition to the Provider column. With these columns properly populated, you should be able to add multiple rows and hit a single "Publish" from within Excel.
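    To illustrate, a new row could look something like this. The values are hypothetical, and the Provider and account type shown are what you would typically see for an Azure AD-backed user; the safest approach is to copy the pattern from an existing, working user in your own grid:

    User ID: jdoe
    User name: John Doe
    Enabled: Yes
    Alias (Email): john.doe@contoso.com
    Type (AccountType): ClaimsUser
    Provider: https://sts.windows.net/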

    Given this, you can have two Excel instances open, connected to two different environments, and copy users over from a source to a target using Excel. As long as all the columns are available and in the same order, of course.

    This post should also give you some clue as to how you can use Data Management to populate a system with users through a Data Package, if that is your preference.

    Use Azure Automation to start and stop your VMs on a schedule

    This post is long overdue; I have been meaning to publish it for over a year. I did present an early version of this script at the AXUG event in Stuttgart, but since then the API around tags has changed, and it has also become very easy to solve authentication using Run As Accounts. The code I am sharing here works on the latest version of the modules, and I hope it will keep working for years to come.

    A few notes before I continue:
    • I based this script on Automys' own code, and it is heavily inspired by commits from other users out in the community. I will refer to the project on GitHub, where you will find contributors and authors.
    • I've only tested and used the script for ARM resources.
    • I removed references to credentials and certificates; the script relies on using a "Run As Account". Setting up a "Run As Account" in Azure is very easy and quick to do.
    You will find the Feature branch here:

    Setup

    I recommend starting by creating a new Automation Account. Yes, you can probably reuse an existing one, but creating a new account does not incur additional costs, and you can get this up and running fairly quickly and easily just by following the steps in this blog post.



    Make sure you select "Yes" on the option of creating an "Azure Run As Account". Let it create all the artifacts, and while you wait you can read the rest of this post.

    When the Automation account is up and running, the next step is to create a new Runbook of type "PowerShell" - just straight up PowerShell, and no fancy stuff.

    Then you grab the script from my feature branch, based off the original trunk. You can either take the script from this post or take the latest from GitHub. I probably won't maintain this blog post with any future updates of the script, but I might maintain the one on GitHub. I'll put a copy down below.

    With the script added as a PowerShell Runbook and saved, you now need to schedule it. This is where a small cost may be incurred, because it is necessary to set the Runbook to run every hour. Yes - every hour. The free tier of Automation only allows a limited number of job minutes (500 per month), so with the Runbook running every hour throughout the day, I believe it will stop running after roughly 20 days each month. The cost incurred when you exceed the free limit is extremely low, though. If you prefer to script the schedule rather than clicking through the portal, see the sketch below.
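    A sketch of scheduling the runbook with the AzureRM Automation cmdlets could look like the following (the resource group, Automation account and runbook names are placeholders):

    # Create an hourly schedule and attach it to the runbook
    New-AzureRmAutomationSchedule -ResourceGroupName 'MyRg' -AutomationAccountName 'MyAutomation' -Name 'Hourly' -StartTime (Get-Date).AddMinutes(10) -HourInterval 1
    Register-AzureRmAutomationScheduledRunbook -ResourceGroupName 'MyRg' -AutomationAccountName 'MyAutomation' -RunbookName 'AutoShutdownSchedule' -ScheduleName 'Hourly'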

    With the script running every hour, you are ready to schedule "downtime". And this is easy: you basically just TAG either the VM or the Resource Group holding a collection of VMs.

    By TAG I mean you type the downtime you want for your resource into the VALUE of a specific TAG. The script looks for a tag named "AutoShutdownSchedule". An example value would be "20:00->06:00, Saturday, Sunday", and you can probably guess when the server will be shut down with that value... That is correct: every weekday between 8 pm at night and 6 am in the morning, plus all day Saturday and Sunday. You can imagine the flexibility this gives. A sketch of setting the tag follows below.
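    Here is a minimal sketch of tagging a VM from Azure PowerShell, using the same AzureRM modules the runbook itself relies on (the resource group and VM names are placeholders):

    # Look up the VM and apply the shutdown schedule tag
    $vm = Get-AzureRmResource -ResourceGroupName 'MyRg' -ResourceName 'MyVm' -ResourceType 'Microsoft.Compute/virtualMachines'
    Set-AzureRmResource -ResourceId $vm.ResourceId -Tag @{ AutoShutdownSchedule = '20:00->06:00, Saturday, Sunday' } -Force

    Be aware that Set-AzureRmResource replaces the whole tag collection, so include any existing tags you want to keep.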

    Added Features

    In addition, the script is inspired by other nice ideas from the community, like providing a TimeZone for your schedule, just to ensure your 8 pm is consistent with how the script interprets the value.

    Another added feature is the ability to use a "NeverStart" keyword value, to ensure the resource does not start. You can use this to schedule an automatic shutdown that does not trigger a startup again after the schedule ends. An example is the value "20:00->21:00,NeverStart". This would stop the resource at 8 pm, and when the Runbook runs again at 9 pm, the resource will not start even though the schedule has ended.

    Finally, I want to mention the added feature of disabling the schedule without removing it: provide an additional tag named "AutoShutdownDisabled" with a value of Yes/1/True. This means you can keep the schedule and temporarily disable the shutdown altogether, as sketched below.
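    A sketch of temporarily disabling the schedule on the same VM as the previous sketch (again placeholder names, and note that both tags are included since Set-AzureRmResource replaces the tag collection):

    Set-AzureRmResource -ResourceId $vm.ResourceId -Tag @{ AutoShutdownSchedule = '20:00->06:00, Saturday, Sunday'; AutoShutdownDisabled = 'Yes' } -Force

    Remove the AutoShutdownDisabled tag, or set it to No/0/False, to re-enable the schedule.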

    The script

    <#
    .SYNOPSIS
    This Azure Automation runbook automates the scheduled shutdown and startup of resources in an Azure subscription.

    .DESCRIPTION
    The runbook implements a solution for scheduled power management of Azure resources in combination with tags
    on resources or resource groups which define a shutdown schedule. Each time it runs, the runbook looks for all
    supported resources or resource groups with a tag named "AutoShutdownSchedule" having a value defining the schedule,
    e.g. "10PM -> 6AM". It then checks the current time against each schedule entry, ensuring that resourcess with tags or in tagged groups
    are deallocated/shut down or started to conform to the defined schedule.

    This is a PowerShell runbook, as opposed to a PowerShell Workflow runbook.

    This script requires the "AzureRM.Resources" module, which is present by default in Azure Automation accounts.
    For detailed documentation and instructions, see:

    CREDITS: Initial version credit goes to Automys, from which this script started:
    https://automys.com/library/asset/scheduled-virtual-machine-shutdown-startup-microsoft-azure

    .PARAMETER Simulate
    If $true, the runbook will not perform any power actions and will only simulate evaluating the tagged schedules. Use this
    to test your runbook to see what it will do when run normally (Simulate = $false).

    .PARAMETER DefaultScheduleIfNotPresent
    If provided, will set the default schedule to apply on all resources that don't have any scheduled tag value defined or inherited.

    Description | Tag value
    Shut down from 10PM to 6 AM UTC every day | 10pm -> 6am
    Shut down from 10PM to 6 AM UTC every day (different format, same result as above) | 22:00 -> 06:00
    Shut down from 8PM to 12AM and from 2AM to 7AM UTC every day (bringing online from 12-2AM for maintenance in between) | 8PM -> 12AM, 2AM -> 7AM
    Shut down all day Saturday and Sunday (midnight to midnight) | Saturday, Sunday
    Shut down from 2AM to 7AM UTC every day and all day on weekends | 2:00 -> 7:00, Saturday, Sunday
    Shut down on Christmas Day and New Year's Day | December 25, January 1
    Shut down from 2AM to 7AM UTC every day, and all day on weekends, and on Christmas Day | 2:00 -> 7:00, Saturday, Sunday, December 25
    Shut down always - I don't want this VM online, ever | 0:00 -> 23:59:59


    .PARAMETER TimeZone
    Defines the Timezone used when running the runbook. "W. Europe Standard Time" by default.
    Microsoft Time Zone Index Values:
    https://msdn.microsoft.com/en-us/library/ms912391(v=winembedded.11).aspx

    .EXAMPLE
    For testing examples, see the documentation at:

    https://automys.com/library/asset/scheduled-virtual-machine-shutdown-startup-microsoft-azure

    .INPUTS
    None.

    .OUTPUTS
    Human-readable informational and error messages produced during the job. Not intended to be consumed by another runbook.
    #>
    [CmdletBinding()]
    param(
    [parameter(Mandatory=$false)]
    [bool]$Simulate = $false,
    [parameter(Mandatory=$false)]
    [string]$DefaultScheduleIfNotPresent,
    [parameter(Mandatory=$false)]
    [String] $Timezone = "W. Europe Standard Time"
    )

    $VERSION = '3.3.0'
    $autoShutdownTagName = 'AutoShutdownSchedule'
    $autoShutdownOrderTagName = 'ProcessingOrder'
    $autoShutdownDisabledTagName = 'AutoShutdownDisabled'
    $defaultOrder = 1000

    $ResourceProcessors = @(
    @{
    ResourceType = 'Microsoft.ClassicCompute/virtualMachines'
    PowerStateAction = { param([object]$Resource, [string]$DesiredState) (Get-AzureRmResource -ResourceId $Resource.ResourceId).Properties.InstanceView.PowerState }
    StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Force }
    DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'shutdown' -Force }
    },
    @{
    ResourceType = 'Microsoft.Compute/virtualMachines'
    PowerStateAction = {
    param([object]$Resource, [string]$DesiredState)

    $vm = Get-AzureRmVM -ResourceGroupName $Resource.ResourceGroupName -Name $Resource.Name -Status
    $currentStatus = $vm.Statuses | Where-Object Code -like 'PowerState*'
    $currentStatus.Code -replace 'PowerState/',''
    }
    StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Force }
    DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'deallocate' -Force }
    },
    @{
    ResourceType = 'Microsoft.Compute/virtualMachineScaleSets'
    #since there is no way to get the status of a VMSS, we assume it is in the inverse state to force the action on the whole VMSS
    PowerStateAction = { param([object]$Resource, [string]$DesiredState) if($DesiredState -eq 'StoppedDeallocated') { 'Started' } else { 'StoppedDeallocated' } }
    StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Parameters @{ instanceIds = @('*') } -Force }
    DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'deallocate' -Parameters @{ instanceIds = @('*') } -Force }
    }
    )

    # Define function to get current date using the TimeZone parameter
    function GetCurrentDate
    {
    return [system.timezoneinfo]::ConvertTime($(Get-Date),$([system.timezoneinfo]::GetSystemTimeZones() | ? id -eq $Timezone))
    }

    # Define function to check current time against specified range
    function Test-ScheduleEntry ([string]$TimeRange)
    {
    # Initialize variables
    $rangeStart, $rangeEnd, $parsedDay = $null
    $currentTime = GetCurrentDate
    $midnight = $currentTime.AddDays(1).Date

    try
    {
    # Parse as range if contains '->'
    if($TimeRange -like '*->*')
    {
    $timeRangeComponents = $TimeRange -split '->' | foreach {$_.Trim()}
    if($timeRangeComponents.Count -eq 2)
    {
    $rangeStart = Get-Date $timeRangeComponents[0]
    $rangeEnd = Get-Date $timeRangeComponents[1]

    # Check for crossing midnight
    if($rangeStart -gt $rangeEnd)
    {
    # If current time is between the start of range and midnight tonight, interpret start time as earlier today and end time as tomorrow
    if($currentTime -ge $rangeStart -and $currentTime -lt $midnight)
    {
    $rangeEnd = $rangeEnd.AddDays(1)
    }
    # Otherwise interpret start time as yesterday and end time as today
    else
    {
    $rangeStart = $rangeStart.AddDays(-1)
    }
    }
    }
    else
    {
    Write-Output "`tWARNING: Invalid time range format. Expects valid .Net DateTime-formatted start time and end time separated by '->'"
    }
    }
    # Otherwise attempt to parse as a full day entry, e.g. 'Monday' or 'December 25'
    else
    {
    # If specified as day of week, check if today
    if([System.DayOfWeek].GetEnumValues() -contains $TimeRange)
    {
    if($TimeRange -eq (Get-Date).DayOfWeek)
    {
    $parsedDay = Get-Date '00:00'
    }
    else
    {
    # Skip detected day of week that isn't today
    }
    }
    # Otherwise attempt to parse as a date, e.g. 'December 25'
    else
    {
    $parsedDay = Get-Date $TimeRange
    }

    if($parsedDay -ne $null)
    {
    $rangeStart = $parsedDay # Defaults to midnight
    $rangeEnd = $parsedDay.AddHours(23).AddMinutes(59).AddSeconds(59) # End of the same day
    }
    }
    }
    catch
    {
    # Record any errors and return false by default
    Write-Output "`tWARNING: Exception encountered while parsing time range. Details: $($_.Exception.Message). Check the syntax of entry, e.g. ' -> ', or days/dates like 'Sunday' and 'December 25'"
    return $false
    }

    # Check if current time falls within range
    if($currentTime -ge $rangeStart -and $currentTime -le $rangeEnd)
    {
    return $true
    }
    else
    {
    return $false
    }

    } # End function Test-ScheduleEntry


    # Function to handle power state assertion for resources
    function Assert-ResourcePowerState
    {
    param(
    [Parameter(Mandatory=$true)]
    [object]$Resource,
    [Parameter(Mandatory=$true)]
    [string]$DesiredState,
    [bool]$Simulate
    )

    $processor = $ResourceProcessors | Where-Object ResourceType -eq $Resource.ResourceType
    if(-not $processor) {
    throw ('Unable to find a resource processor for type ''{0}''. Resource: {1}' -f $Resource.ResourceType, ($Resource | ConvertTo-Json -Depth 5000))
    }
    # If should be started and isn't, start resource
    $currentPowerState = & $processor.PowerStateAction -Resource $Resource -DesiredState $DesiredState
    if($DesiredState -eq 'Started' -and $currentPowerState -notmatch 'Started|Starting|running')
    {
    if($Simulate)
    {
    Write-Output "`tSIMULATION -- Would have started resource. (No action taken)"
    }
    else
    {
    Write-Output "`tStarting resource"
    & $processor.StartAction -ResourceId $Resource.ResourceId
    }
    }

    # If should be stopped and isn't, stop resource
    elseif($DesiredState -eq 'StoppedDeallocated' -and $currentPowerState -notmatch 'Stopped|deallocated')
    {
    if($Simulate)
    {
    Write-Output "`tSIMULATION -- Would have stopped resource. (No action taken)"
    }
    else
    {
    Write-Output "`tStopping resource"
    & $processor.DeallocateAction -ResourceId $Resource.ResourceId
    }
    }

    # Otherwise, current power state is correct
    else
    {
    Write-Output "`tCurrent power state [$($currentPowerState)] is correct."
    }
    }

    # Main runbook content
    try
    {
    $currentTime = GetCurrentDate
    Write-Output "Runbook started. Version: $VERSION"
    if($Simulate)
    {
    Write-Output '*** Running in SIMULATE mode. No power actions will be taken. ***'
    }
    else
    {
    Write-Output '*** Running in LIVE mode. Schedules will be enforced. ***'
    }
    Write-Output "Current UTC/GMT time [$($currentTime.ToString('dddd, yyyy MMM dd HH:mm:ss'))] will be checked against schedules"


    $Conn = Get-AutomationConnection -Name AzureRunAsConnection
    $resourceManagerContext = Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint

    $resourceList = @()
    # Get a list of all supported resources in subscription
    $ResourceProcessors | % {
    Write-Output ('Looking for resources of type {0}' -f $_.ResourceType)
    $resourceList += @(Find-AzureRmResource -ResourceType $_.ResourceType)
    }

    $ResourceList | % {
    if($_.Tags -and $_.Tags.ContainsKey($autoShutdownOrderTagName) ) {
    $order = $_.Tags | % { if($_.ContainsKey($autoShutdownOrderTagName)) { $_.Item($autoShutdownOrderTagName) } }
    } else {
    $order = $defaultOrder
    }
    Add-Member -InputObject $_ -Name ProcessingOrder -MemberType NoteProperty -TypeName Integer -Value $order
    }

    $ResourceList | % {
    if($_.Tags -and $_.Tags.ContainsKey($autoShutdownDisabledTagName) ) {
    $disabled = $_.Tags | % { if($_.ContainsKey($autoShutdownDisabledTagName)) { $_.Item($autoShutdownDisabledTagName) } }
    } else {
    $disabled = '0'
    }
    Add-Member -InputObject $_ -Name ScheduleDisabled -MemberType NoteProperty -TypeName String -Value $disabled
    }

    # Get resource groups that are tagged for automatic shutdown of resources
    $taggedResourceGroups = Find-AzureRmResourceGroup -Tag @{ "AutoShutdownSchedule" = $null }
    # Expand the Name property so the list contains plain strings for the -contains check below
    $taggedResourceGroupNames = @($taggedResourceGroups | select -ExpandProperty Name)

    Write-Output "Found [$($taggedResourceGroupNames.Count)] schedule-tagged resource groups in subscription"

    if($DefaultScheduleIfNotPresent) {
    Write-Output "Default schedule was specified, all non tagged resources will inherit this schedule: $DefaultScheduleIfNotPresent"
    }

    # For each resource, determine
    # - Is it directly tagged for shutdown or member of a tagged resource group
    # - Is the current time within the tagged schedule
    # Then assert its correct power state based on the assigned schedule (if present)
    Write-Output "Processing [$($resourceList.Count)] resources found in subscription"
    foreach($resource in $resourceList)
    {
    $schedule = $null

    if ($resource.ScheduleDisabled)
    {
    $disabledValue = $resource.ScheduleDisabled
    if ($disabledValue -eq "1" -or $disabledValue -eq "Yes" -or $disabledValue -eq "True")
    {
    Write-Output "[$($resource.Name)]: `r`n`tIGNORED -- Found direct resource schedule with $autoShutdownDisabledTagName value: $disabledValue."
    continue
    }
    }

    # Check for direct tag or group-inherited tag
    if($resource.Tags.Count -gt 0 -and $resource.Tags.ContainsKey($autoShutdownTagName) -eq $true)
    {
    # Resource has direct tag (possible for resource manager deployment model resources). Prefer this tag schedule.
    $schedule = $resource.Tags.Item($autoShutdownTagName)
    Write-Output "[$($resource.Name)]: `r`n`tADDING -- Found direct resource schedule tag with value: $schedule"
    }
    elseif($taggedResourceGroupNames -contains $resource.ResourceGroupName)
    {
    # resource belongs to a tagged resource group. Use the group tag
    $parentGroup = $taggedResourceGroups | Where-Object Name -eq $resource.ResourceGroupName
    $schedule = $parentGroup.Tags.Item($autoShutdownTagName)
    Write-Output "[$($resource.Name)]: `r`n`tADDING -- Found parent resource group schedule tag with value: $schedule"
    }
    elseif($DefaultScheduleIfNotPresent)
    {
    $schedule = $DefaultScheduleIfNotPresent
    Write-Output "[$($resource.Name)]: `r`n`tADDING -- Using default schedule: $schedule"
    }
    else
    {
    # No direct or inherited tag. Skip this resource.
    Write-Output "[$($resource.Name)]: `r`n`tIGNORED -- Not tagged for shutdown directly or via membership in a tagged resource group. Skipping this resource."
    continue
    }

    # Check that tag value was successfully obtained
    if($schedule -eq $null)
    {
    Write-Output "[$($resource.Name) `- $($resource.ProcessingOrder)]: `r`n`tIGNORED -- Failed to get tagged schedule for resource. Skipping this resource."
    continue
    }

    # Parse the ranges in the Tag value. Expects a string of comma-separated time ranges, or a single time range
    $timeRangeList = @($schedule -split ',' | foreach {$_.Trim()})

    # Check each range against the current time to see if any schedule is matched
    $scheduleMatched = $false
    $matchedSchedule = $null
    $neverStart = $false #if NeverStart is specified in range, do not wake-up machine
    foreach($entry in $timeRangeList)
    {
    if((Test-ScheduleEntry -TimeRange $entry) -eq $true)
    {
    $scheduleMatched = $true
    $matchedSchedule = $entry
    break
    }

    if ($entry -eq "NeverStart")
    {
    $neverStart = $true
    }
    }
    Add-Member -InputObject $resource -Name ScheduleMatched -MemberType NoteProperty -TypeName Boolean -Value $scheduleMatched
    Add-Member -InputObject $resource -Name MatchedSchedule -MemberType NoteProperty -TypeName String -Value $matchedSchedule
    Add-Member -InputObject $resource -Name NeverStart -MemberType NoteProperty -TypeName Boolean -Value $neverStart
    }

    foreach($resource in $resourceList | Group-Object ScheduleMatched) {
    if($resource.Name -eq '') {continue}
    $sortedResourceList = @()
    if($resource.Name -eq $false) {
    # meaning we start resources, lower to higher
    $sortedResourceList += @($resource.Group | Sort ProcessingOrder)
    } else {
    $sortedResourceList += @($resource.Group | Sort ProcessingOrder -Descending)
    }

    foreach($resource in $sortedResourceList)
    {
    # Enforce desired state for group resources based on result.
    if($resource.ScheduleMatched)
    {
    # Schedule is matched. Shut down the resource if it is running.
    Write-Output "[$($resource.Name) `- P$($resource.ProcessingOrder)]: `r`n`tASSERT -- Current time [$currentTime] falls within the scheduled shutdown range [$($resource.MatchedSchedule)]"
    Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'StoppedDeallocated'

    }
    else
    {
    if ($resource.NeverStart)
    {
    Write-Output "[$($resource.Name)]: `tIGNORED -- Resource marked with NeverStart. Keeping the resources stopped."
    Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'StoppedDeallocated'
    }
    else
    {
    # Schedule not matched. Start resource if stopped.
    Write-Output "[$($resource.Name) `- P$($resource.ProcessingOrder)]: `r`n`tASSERT -- Current time falls outside of all scheduled shutdown ranges. Start resource."
    Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'Started'
    }
    }
    Assert-ResourcePowerState -Resource $resource -DesiredState $resource.DesiredState -Simulate $Simulate
    }
    }

    Write-Output 'Finished processing resource schedules'
    }
    catch
    {
    $errorMessage = $_.Exception.Message
    throw "Unexpected exception: $errorMessage"
    }
    finally
    {
    Write-Output "Runbook finished (Duration: $(('{0:hh\:mm\:ss}' -f ((GetCurrentDate) - $currentTime))))"
    }
