Using Gmail Account with SMTP Setup with Dynamics Business Central / Dynamics NAV


Not all customers use Office 365 for their e-mail. The other popular option companies use is G Suite from Google.

When you’re trying to set up SMTP Mail using Gmail accounts in G Suite, you may encounter this error:

The mail system returned the following error: “Failure sending mail.
Unable to read data from the transport connection: net_io_connectionclosed.”.


The problem is with how Google decides which applications it deems less secure. If you’re using an application it deems less secure, Google will refuse the connection.

What you’ll need to do is change your settings on Gmail.

Under Less Secure App access, you will need to turn this on.
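For reference, the SMTP Mail Setup values that typically work with a Gmail / G Suite account look like this (your domain and user will differ):

```
SMTP Server       : smtp.gmail.com
SMTP Server Port  : 587
Authentication    : Basic
User ID           : yourname@yourdomain.com  (the full e-mail address)
Secure Connection : Yes
```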


Now when you go back to your Dynamics 365 Business Central (aka Dynamics NAV) application, you will be able to send mail from the SMTP settings.

Dirty Trick to Keep Your Customers Hostage


For whatever reason, customers will want to work with one partner over another. It may be lack of knowledge, lack of support, slow response, or whatever. The customer can freely choose who they want to work with in regards to Dynamics 365 Business Central (aka Dynamics NAV).

Of course, the Microsoft NAV partner that the customer is leaving does not like that. They want to keep the customer on and collect money without doing the work.

It also leads to hurt feelings for the partner in question. I would equate that feeling to being dumped by your significant other.

Nonetheless, one of the best “features” that’s not emphasized enough in the Dynamics 365 Business Central (aka Dynamics NAV) channel is that if the customer does not want to work with a particular company, they can freely choose another one without much trouble.

You’re not dealing with Microsoft, you’re dealing with partners that work with Microsoft.

Old Trick to Prevent You from Leaving

There are a couple of dirty tricks that the industry employs to hold the customer hostage.

One of the oldest tricks is through licensing. They sell you add-ons in the 30-million object ranges that only THEIR company has access to.

Oftentimes, customers will not know they’re in a restricted add-on range because, quite frankly, when you’re getting into a new marriage, nothing can go wrong!

I’ve personally met with a few customers that ended up switching to a different ERP software because the previous partner wouldn’t release their code to other NAV partners.

What can you do? Threaten a lawsuit? Did you read the fine print when you bought NAV from that company? If you did, did you understand what you were signing?

I fully understand why Microsoft does not want to get involved in these situations. The original selling partner will claim that their add-on is their secret IP and that they don’t want other people to “steal” it.

On the other hand, if the NAV partner sucks or just doesn’t have the resources available, only the customer suffers.

This topic probably warrants a separate blog article, since the question of Extensions comes into play. Basically, any extension can be treated like the objects in the 30-million ranges.

Dirty Trick to Prevent You from Leaving

If the customer refuses to purchase the partner’s add-on, the other way is to lock down the database.

How do we know this?

Recently, we worked with a new client that came to us because they didn’t like their previous partner. When we got our first modification request, I was horrified to find that we couldn’t modify any objects in the database.

This is even though I have the sysadmin role on SQL Server!

The error is:

The following SQL Server error or errors occurred when accessing the Object table: 50000,"42000",[Microsoft][SQL Server Native Client 11.0][SQL Server]You do not have permission to modify objects.

UPDATE [NAV2016].[dbo].[Object] SET [Compiled]=0,[BLOB Reference]= NULL,[BLOB Size]=924,[Date]={ts '2018-10-11 00:00:00.000'},[Time]={ts '1754-01-01 17:37:53.955'} WHERE ([Type]=1 AND [Company Name]='' AND [ID]=2000000071)

Asking Around

Coincidentally, I was at a Dynamics 365 Business Central conference (Directions) where all of the best NAV experts are in one place for me to pick brains from.

When I showed this error around, everyone was stumped and had no idea what was causing it. I even approached some of the Dynamics NAV MVPs, and they asked the usual questions:

1. Are you sysadmin?
2. Do you have administrator on that computer?

No one had seen this error.

The Solution

Finally, Floyd Chan from Qixas Group gave me a hint: if he were devious, he would put code directly in the SQL Server tables; more specifically, in the triggers.

Sure enough, when we checked the triggers on the Object table in SQL Server, we found this:

After removing the table trigger from the Object table, we were freely able to modify the database objects again.
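For anyone who runs into this, here is a sketch of what such a trigger might look like and how to find and remove it. The trigger name and message below are made up for illustration; check the actual triggers on your own Object table.

```sql
-- A sketch of the kind of trigger that can lock down the Object table.
-- Any UPDATE against dbo.Object is rolled back, so the development
-- environment can no longer save or compile objects:
CREATE TRIGGER [dbo].[Object$Lock] ON [dbo].[Object]
AFTER UPDATE
AS
BEGIN
    RAISERROR('You do not have permission to modify objects.', 16, 1);
    ROLLBACK TRANSACTION;
END
GO

-- To check whether your Object table has triggers on it:
SELECT name FROM sys.triggers WHERE parent_id = OBJECT_ID('dbo.Object');

-- And to remove the offending trigger:
DROP TRIGGER [dbo].[Object$Lock];
```

Note that a user-raised RAISERROR comes back as error number 50000, which matches the error we were seeing from NAV.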


Over the years, I’ve seen a lot of weird and questionable things.

This, to me, left a really bad taste in my mouth. Partners will go as far as locking down the database to prevent other developers from working with their customers.

I just want to put this out there so people will not have to suffer through this as well. Once is enough.

If this wasn’t figured out, I wonder what it would have cost to have the original partner “unlock” the Object table.

Dynamics NAV Extensions – A Potential Weapon of Mass Destruction

With the release of Dynamics NAV 2017 and Dynamics 365 for Financials (which is technically NAV 2017), the buzzword in the Dynamics NAV development community is the development feature called Extensions.

What are Extensions?
Extensions are a way for NAV developers to put modifications in your live environment without modifying your core NAV system.

How Does it Work?
An extension basically implements the code on the service tier instead of in the NAV development environment. The clients connect to NAV through the service tier; this is how you see your NAV in whatever state it’s in.

Implementing the code directly in the service tier means that you won’t see any of the modifications if you go into the NAV development environment. You can only see the new tables created and table field changes if you look into the SQL tables directly.

Why is the Development Community Buzzing About This?
I honestly don’t know. I think the hype of Microsoft releasing something new has people going gaga and getting swept up in the moment.

I remember when Microsoft first announced the idea of multi-tenancy for NAV services. Even though people didn’t fully understand it and it didn’t apply to 99% of the NAV population, the community still touted it like it was the second coming. But… that’s hype and marketing for you.

In its current state, NAV Extensions will benefit partners and ISVs in that they can quickly take their code and implement it on customer sites. In addition, it gives partners, ISVs, and independent NAV developers a way to protect their code from everyone else that does not have the “source code”.

For customers, the benefit is that partners won’t need to charge as much for putting in modifications that they’ve already developed for other customers. Customers will now have a wide range of solutions, developed as extensions, that they can essentially “bolt on”.

Lastly, it’s the only way to get your IP (intellectual property) on the Microsoft AppSource.

Why it’s a Weapon of Mass Destruction for a Company?

Notice the statement I made above:

It gives partners, ISVs, and independent NAV developers a way to protect their code from everyone else that does not have the “source code”.

By implementing the code directly on the service tier, nobody in the world will be able to modify what the original developer has done. The code that your partner/ISV/internal developer has put in is effectively sealed off from the rest of the world.

This means that if you are, for whatever reason, unhappy with your partner/ISV/internal developer, you had better make sure, as a company, that you have the original source code from which they built their extensions.

Playing the devil’s advocate and assuming the worst scenarios with extensions

Say you have a NAV or ISV partner that delivered their modifications to you as extensions, and they provide terrible service and lack basic knowledge of the product. If you want to ask another NAV partner or a freelancer to come in and help you, you will need the blessing of your previous NAV partner and pray that they are cooperative in providing the source code.

Say you purchased an ISV solution for your business, but there are still some missing features that you need your partner to add. Your partner will be unable to help you. You will need to rely on the ISV to re-release the extension for you.

Say you have your own internal NAV developer on staff, and you need to terminate his or her employment for whatever reason. It’s very possible he/she can take the source code and leave you high and dry. No partner can come to your aid, and you essentially have to live with what you have.

Even now, if you purchase an add-on in the 30-million object ranges, you’re effectively held prisoner by your ISV or partner. There are special ways to “hack” the 30-million object ranges within SQL, but it’s usually messy and will incur a lot of additional expense on your part.

However, with Extensions in their current form, it will be impossible for any NAV developer to come help you if you do not have the original source code for the extensions.

Dynamics NAV (Navision) has always been an open-code product, where customers and partners could read and modify the application objects. Implementing a structure where you effectively seal off the code is troubling.

It seems like before you do any kind of Extensions development with your partner or customer, you will need to lawyer up first. And as we all know, once you get lawyers into the mix, it just goes downhill from there.

Why it makes sense to upgrade to (at least) Dynamics NAV 2015

As you probably all know, Microsoft has announced the next release of Dynamics NAV, called Microsoft Dynamics NAV 2016. With this release, there are a lot of fantastic features out of the box. Here’s an image of the planned feature releases for Dynamics NAV 2016:


Historically, there are two major time-consuming portions of an upgrade:

  • Code Merge
  • Data Upgrade

However, since the release of Dynamics NAV 2015, Microsoft has introduced a number of PowerShell cmdlets to automate the upgrade process.

One of the cmdlets in the Development Shell, Merge-NAVApplicationObject, essentially merges the application code for you and spits out any conflicts for you to resolve manually.

Additionally, Microsoft introduced the cmdlet Start-NAVDataUpgrade, which essentially runs the upgrade toolkit for all companies as a script.
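To give you an idea of the flow, here is a sketch run from the NAV Development Shell. The folder names and the server instance name are illustrative, not prescriptive:

```powershell
# Three-way merge: baseline objects, our customized objects, and the new
# version's objects. Conflicts are written out for manual resolution.
Merge-NAVApplicationObject -OriginalPath .\OriginalObjects `
                           -ModifiedPath .\CustomizedObjects `
                           -TargetPath .\NewVersionObjects `
                           -ResultPath .\MergedObjects

# After importing and compiling the merged objects, run the data upgrade
# toolkit for all companies on the upgraded service instance:
Start-NAVDataUpgrade -ServerInstance DynamicsNAV80
```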

This means what used to take hundreds (sometimes thousands) of hours for an upgrade can now be done at a fraction of the time!

And not only for major upgrades: you can use the same method to apply the hotfixes that are released every month to your database without spending tons of hours on consulting fees.

All this… provided you upgrade to at least Dynamics NAV 2015.

The Strategy Behind This
Prior to Dynamics NAV 2015, clients would wait, and wait, and wait, then wait some more until they found a version with the improvements they liked, or sometimes until they couldn’t run Dynamics NAV on their current operating system, before they even considered upgrading. This is why you see some end-user sites running version 5.0, 4.0, 3.7, or even 2.0!

The problem with being behind on versions in any software is that when you do decide to upgrade, it’ll be a major impact on your business and your budget. In some cases, companies running an older version of Dynamics NAV (or Navision, as it was called back then) may decide to get rid of Navision and make the mistake of moving to a competing product.

This is a threat to Microsoft, and they realize it. With the upgrade cmdlets, Microsoft finally has a solution.

What Microsoft Hopes You Do
With these improvements, instead of having a major upgrade, you would do “incremental” upgrades (hotfixes every month, R2 releases, etc). This means that you would implement Dynamics NAV, apply all the monthly release hotfixes with the cmdlets, then use the same method to upgrade to the newer version when it’s released.

The goal is that when you do go to a newer version, the impact on your business and your budget will be incredibly minor, while the newer version’s benefits will have a greater positive impact on your business.

What Microsoft does not want you to do is wait and then do a big version jump with a big upgrade cost.

As any developer that has used the upgrade cmdlets will tell you, it’s good but it’s not perfect. However, it’s a FANTASTIC starting point.

My personal hope is that the upgrade process becomes so easy in future releases that end users can run the upgrade themselves, rather than hiring a consultant to do it for them.


Optimizing your Aged Accounts Receivables Report

Having done numerous upgrades from older versions of Navision to NAV 2015 and 2013, one common complaint is how slowly the reports run. This is especially true for larger reports like Aged Accounts Receivable, Aged Accounts Payable, Inventory Valuation, etc.

The old reports that took a long time, such as the Inventory Valuation report, will still take a long time; it doesn’t matter what version you go to. However, there are some reports that used to be quick but are slow after the upgrade.

One of these reports is the Aged Accounts Receivables report (Report ID: 10040 – Aged Accounts Receivable).

The Breakdown
CAUTION: I’m about to get “programmer”. If you want the faster report, just skip down to the bottom and download the object.

Removing the Data Type of Column from the report, we get the following DataItem that the report loops through:


On initial look, the report seems simple enough. There are 3 data items:

  • Customer – The report loops through the customer records to see which customers we need to calculate the aging for
  • Cust. Ledger Entry – For every customer record it finds, it loops through all of the customer ledger entries for that customer. Any customer ledger entry that has a remaining amount is put into a temporary table
  • Integer – This data item loops through the records that were inserted into the temporary table by the Cust. Ledger Entry data item and summarizes the information to display in the different aging “buckets”

The Problem
The reason this report is slow: if you check the DataItemTableView property on the Cust. Ledger Entry data item, you’ll see that the report loops through ALL of the customer ledger entries for each customer.


This report will run fine if your A/R aging is small. However, it will get slower as time progresses and more transactions accumulate. Worse, it can consume all the memory on the server and force you to restart.

The problem becomes really apparent when you have EDI customers that are running hundreds of invoices per day.

The Solution
The idea of the original A/R aging report is correct: look at the remaining amount based on the date criteria, and if there is a balance, it goes into the calculation.

The problem is that it doesn’t apply any filters to exclude old transactions that have no relevance to the calculation.

To address the performance problem, here are the main things we will need to do:

  1. Look only at transactions that are marked as Open
  2. If the report is backdated, look only at the history that pertains to the date criteria

The first thing we do is set the proper DataItemTableView property with a filter of Open.


Then we add a new Detailed Cust. Ledger Entry data item to look at the application history of our A/R transactions:

The Property:


The Code:


Basically, we’re limiting the database reads to only open transactions and the subsequent A/R applications from the Aged as of Date set on the report.
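The gist of the change, sketched in C/AL (the variable name AgedAsOfDate is illustrative; the downloadable object below is the actual implementation):

```
// Cust. Ledger Entry data item: only read entries still marked Open
DataItemTableView = WHERE(Open=CONST(Yes));

// Detailed Cust. Ledger Entry data item - OnPreDataItem()
// Only read applications posted AFTER the aging date, so a backdated
// report can add back amounts that were applied later but were still
// open as of the Aged as of Date:
SETRANGE("Posting Date", AgedAsOfDate + 1, DMY2DATE(31,12,9999));
```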

Here are the report objects and the text file for your reference:

I’m not sure what the developer at Microsoft was thinking when programming this report. Aged Accounts Receivable/Payable is one of the most data-intensive reports next to Inventory Valuation. Reading through every record just does not make sense.

Yes, it’ll work in the short run, but give it a few years and the report will slow to a crawl, something customers upgrading are already experiencing.

Restarting Job Queue when it Hits an Error on Dynamics NAV 2013

Job Queue
In Dynamics NAV 2013, you can set up the job queue to automate the processing of tasks. Popular tasks to automate include Adjust Cost – Item Entries, EDI processing, reporting, etc.

Setting up the job queue in NAV 2013 couldn’t be easier, and step-by-step instructions can be found here.

The Problem
When the job queue runs into an error, the entry will never get picked up again. This means that while the job queue is an automated process, the IT manager will need to monitor it every day to make sure every process is running.

I know what you’re thinking, “This does not make sense!”. I fully agree.

There are some processes where the error is fatal; that’s when you would not want it to run again. However, there are situations where the error occurs because the table is locked by another process, and in those cases you absolutely need it to restart.

This is especially true when you process EDI orders. You have to send back acknowledgments and/or confirmations within a certain timeframe or else you’ll get chargebacks. Having the job queue error out because of table locks does not make much sense.

The Solution
The problem lies in the Job Queue Dispatcher codeunit (448). If you go down to the local function GetNextRequest, you’ll see that, for some odd reason, the process only looks at entries whose status is Ready. So we will need to modify the code to scan for the error entries as well.
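A sketch of the change (the record variable name is illustrative; check the actual code in your version of codeunit 448):

```
// Codeunit 448 (Job Queue Dispatcher), local function GetNextRequest.
// The standard code filters to Ready only:
//   JobQueueEntry.SETRANGE(Status, JobQueueEntry.Status::Ready);
// Widen the filter so errored entries are picked up again as well:
JobQueueEntry.SETFILTER(Status, '%1|%2',
  JobQueueEntry.Status::Ready, JobQueueEntry.Status::Error);
```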


Depending on what you use the job queue for, I would also include entries that are In Process. The reason is that if an entry is running and someone stops the job queue, it’ll stay stuck in the In Process status.

If you set this, it’s important that you also set the Max. No. of Attempts. You don’t want the job queue to keep retrying if there really is a critical error.

A special thanks to Rafael Rezende for helping me with this!

Thought Process on Receiving Defective Inventory in Dynamics NAV

Receiving items from a vendor can be a tricky thing. This question has come up quite a bit during implementations regarding defective inventory. I know a lot of companies have put a lot of modifications into receiving defective inventory, so I’d like to propose an out-of-the-box solution to receiving goods that include some defective quantities.

Here is the scenario:
1. A purchase order came in on a container for ItemA with 100 pieces
2. Of the 100 pieces, 30 are damaged, and the company wants to reject these pieces and send them back to the supplier
3. The user wants to be able to return the 30 pieces to the supplier and revert the Qty. to Receive to 30 to indicate there are still 30 to be received

Before we get into processing this, I want to bring up an important concept a person taught me when I was starting out doing Dynamics NAV:

Processing any task on a computer is no different from processing that task manually.

This is a very important concept that changed the way I think about automation, implementation, and how we can use Dynamics NAV (or any other computer software) to help us. This is especially true in the world of accounting, where the paper trail is everything. This concept is literally the difference between a 100-hour modification and a 2-hour training session.

The Quick and Dirty Way
Looking at this problem, the natural instinct would be to do this:
1. Use the undo receipt function to undo what I’ve received
2. Post the receipt of the correct 70 quantities so the PO would look like it has 30 remaining on the original line

Here are the problems with the above approach:
1. Undo Receipt will not work if the received quantity has been sold
2. You have no record to match up with the vendor bill of lading
3. There is no record of the return to the vendor

Realistically speaking, is the truck driver unloading the shipment going to wait for you to do QC on the pieces received? The trucker has people to see and places to go; the trucker is a driver and probably doesn’t even work for the vendor you bought the goods from.

For a developer with limited knowledge of operations, this would be the easiest process. Accounting wants it easy and just wants everything back to how it should be. Warehouse and operations just want to do it and avoid the system at all costs.

This process would work in a perfect world. The problem arises when there are disputes, lost items, or vendor reconciliations where the vendor says you received some goods but your records say you didn’t.

As I’ve said time and time again, you can skip any data entry you want as long as the financials balance out. The problem only arises when it’s time to reconcile.

The Manual Way
Going back to the concept laid out before, let’s solve the same problem as if we had no computer in front of us. Here’s what I would do.

1. Unload the goods from the truck and sign the BOL, because the trucker needs to leave
2. Place the goods in a holding area in the warehouse to be checked/put-away
3. As the goods are checked, move the defective items into a separate area in the warehouse and put the good pieces in the warehouse bins
4. Call the vendor and say “Dude, your product is broken, I’m gonna return it. I also need you to replace the 30 that’s defective.”
5. Arrange transportation, prepare packing list, bill of lading, (if international, prepare commercial invoice, etc).

Bring in the System
Knowing how we process this manually, we can then replicate this into the system, in our case, Dynamics NAV:
1. Receive the items in full into the QC location (or into your main warehouse and into the QC bin if you’re using WHM)
2. As the items are checked, the good items should be moved to your main warehouse or bins using the item reclass journal or movement worksheet
3. Create Purchase Return Order for the defective goods. Use the document to generate the packing list, commercial invoice, etc.
4. Add a line to the purchase order to indicate replacement of the additional goods

Doing the above creates some additional steps and data entry, but it has the benefit of a paper trail. You need the full story of what happened to the PO: you need to show that you received 100 pieces originally, returned 30, then received the replacement 30.

Each of the steps above can be handled by one person, but they should be split out to ensure checks and balances. For example, it’s not realistic for the warehouse guys to call your vendor asking for a return.

In addition, it’s not realistic to generate a return every time there’s a defective part, especially if the vendor is in another country. For returns dealing with international vendors, the company will usually accumulate all of the defective parts until they can fill a container, then process the return in one shot.

Again, doing things the proper way will create some additional steps, but I’m assuming you bought Dynamics NAV to help you organize your business, not create more mess. Not every task requested by end users makes sense in the long term.

It’s really up to the consultants to challenge the end users’ way of thinking and determine the proper way to process certain business tasks. If you find that every request you make to the consultant results in additional modification instead of training, you probably need to challenge your own requests to see if they would make sense if you were to do the task manually.

Magento and Dynamics NAV (Navision)

Magento is an unstoppable force.

There are NAV e-commerce add-ons out there; however, the websites they create, at least from what we’ve seen, look very outdated and “old”. In addition, it’s hard to add features and customize the website to your liking. The end result is probably not the best foot to put forward if your e-commerce webstore is the first thing potential customers see of your company.

Even if the integration works flawlessly, if you have a terrible-looking website that’s hard to navigate, you’ve just wasted your money. This is true for both B2C (Business to Consumer) and B2B (Business to Business) websites.

The infrastructure is basically the same: to have real time, you must host the web server in house. Not a lot of companies like to do that because of the reliability concerns and the cost (IT people, hardware, software, etc.) of hosting a company’s most vital order-taking system in-house.

When it’s hosted, then we’re just uploading, downloading, and syncing data anyway, so there’s really not much benefit in purchasing an expensive e-commerce add-on for Navision.

This is where Magento comes into play. Magento is an open source webstore platform that’s gaining in popularity. How do I know this? Well, I hear about a lot of new software and services that our clients are excited about. Usually, when I hear about a product once or twice, I’ll make a mental note. But more than that, it requires some investigation, because I know the next thing the client will ask for is integration.

There are usually 2 components when designing the integration piece:
1. Getting the data to the Magento site
2. Getting the data back from Magento to Navision, e.g. approval amounts

There are a couple of ways to go about it:
1. Webservice directly to your database
2. Flat file transfer to Magento web database
3. Pump data to SQL Express and have Magento do query on it
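As an illustration of option 3, a staging database could hold flattened copies of the NAV data for the website to read. All of the table and column names below are made up for the example; NAS (or a scheduled job) would push rows into it:

```sql
-- Hypothetical staging table in a separate SQL Express database
-- that the Magento site is allowed to query:
CREATE TABLE ItemStaging (
    ItemNo      NVARCHAR(20) PRIMARY KEY,
    Description NVARCHAR(50),
    UnitPrice   DECIMAL(18,2),
    Inventory   DECIMAL(18,2),
    LastSynced  DATETIME
);
```

The point of the separate database is isolation: web traffic never touches the production NAV database.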

I’m not a fan of having web services connect directly to your production database because, depending on the traffic of your site, it may cause performance problems. You probably don’t want people around the world querying your production database while the customer service people are on the phone with your customers.

Personally, I prefer options 2 and 3 because if the website is down, you still have your ERP to take phone orders. If your ERP is down, you still have your website to take orders.

For real time, or as close to real time as possible, you can use NAS to pump data in and out as often as you like, which is what options 2 and 3 are for.

Your Magento developer(s) shouldn’t have any problems with the import/export of the data you give them.

There are probably a lot of other methods of integrating your Dynamics NAV (Navision) solution with Magento. The important thing is to not get caught up between what you need and what you’re being sold. Usually simple is good, and simple is more than enough.

Having said that, I’d love to hear about other methods you use to integrate Dynamics NAV and Magento. If Magento is the best thing for web stores since sliced bread, we in the Dynamics NAV (Navision) community should be ready for it.

Fedex API to be Discontinued for E-Ship

In case you haven’t heard, Fedex will discontinue the old API used by Lanham’s E-Ship module for Fedex integration. The old API will be discontinued in May of 2012. While that’s still a couple of months out, it may be worthwhile to plan an upgrade of E-Ship so you can continue using the Fedex Integration granule from Lanham without interruption when the time comes.

This has been a long time coming. Fedex announced that they would discontinue the API a while ago. It has taken a while for the folks at Lanham to develop the new interface to talk with Fedex.

The old Fedex API has its share of problems. The most notable one is that it does not support Windows 7. With new machines from Fedex shipping with Windows 7 pre-installed, I’m sure your IT department will be pretty happy about the update.

CORRECTION: The web service for E-Ship is NOT the NAV2009 Web services. It’s the Fedex Web Services. So this means that you can still use version 3.7A (the lowest version you have to be on) and still implement the Fedex Web Services. You WILL need to be current on the Lanham enhancement in order to use the Fedex Web Service. The version of E-Ship will need to be at least SE0.54.18.

This is yet another reason for end users to stay current on their enhancement. If you missed the “Sky’s the Limit” offer and you wish to continue shipping using Lanham’s Fedex integration, it’s time to plan for an alternative solution or get current on the enhancement by any means possible. Microsoft enhancement amnesty will end on June 30th, 2011.

You can register to see the new Fedex web services from the webinar hosted by Lanham here:
Thursday, June 23rd at 2:00 p.m. to 3:00 p.m. EST
Register Now!

Navision RDLC reporting – SetData and GetData – Why It Is REQUIRED

Ever wondered why there’s no tutorial on how to create a Sales Order report from scratch in RDLC? The reason is that it takes a LONG TIME! Even for an experienced developer, it takes a long time. As I previously mentioned in my article, Microsoft really needs to address this in future versions.

The reason for SetData and GetData is not the performance reason stated in manual 80146B. For additional information on defining SetData and GetData, please look here.

For multi-page reports, the header data is dependent on whether there are lines. If you’re printing form-type reports like the sales order and you do not use SetData and GetData, the header will only link to the lines displayed on the first page of the report. This means that if your sales order spills onto the 2nd page, the header information will all disappear.

Here’s an example if you create a report without using the SetData and GetData logic:

This is the first page. As you can see, the header displays nice and pretty. I used whiteout to remove some sensitive information in Paint.

Now this is what happens when you print the 2nd page:

No, it’s not an error. You’re seeing it correctly. It’s a blank page. I didn’t even have to use Paint to remove any information.

Again, the reason the 2nd page is blank is that the header link was only made on the first page. If the report goes to a 2nd page, the link is essentially gone; therefore, no value is loaded and nothing is displayed.


So when you create a report that has headers in forms (sales order, quote, etc.), you need these:

Shared Offset As Integer
Shared NewPage As Object

Public Function GetGroupPageNumber(NewPage As Boolean, PageNumber As Integer) As Object
    If NewPage Then
        Offset = PageNumber - 1
        NewPage = False
    End If
    Return PageNumber - Offset
End Function

Public Function IsNewPage() As Boolean
    NewPage = True
    Return NewPage
End Function

Shared HeaderData As Object

Public Function GetData(Num As Integer) As Object
    Return Cstr(Choose(Num, Split(Cstr(HeaderData), Chr(177))))
End Function

Public Function SetData(NewData As Object)
    If NewData <> "" Then
        HeaderData = NewData
    End If
End Function

 And you need these controls with the proper code:
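The pattern behind those controls is a hidden textbox in the report body that packs the header fields into one string, plus header textboxes that unpack them by position. The field names below are illustrative, not the exact dataset names:

```
' Hidden textbox in the report body, packing values with Chr(177):
=Code.SetData(Cstr(Fields!CustomerName.Value) & Chr(177) &
              Cstr(Fields!DocumentNo.Value) & Chr(177) &
              Cstr(Fields!DocumentDate.Value))

' Textboxes in the page header, unpacking by position:
=Code.GetData(1)   ' Customer Name
=Code.GetData(2)   ' Document No.
=Code.GetData(3)   ' Document Date
```

Because the page header is rendered on every page, GetData keeps pulling the last values SetData stored, which is what keeps the header populated past page 1.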

We spent hours and hours trying to get our report header to print on multiple pages. Don’t make the same mistakes we did!

EDIT – Thanks to Steven for pointing this out. It turns out this was mentioned in the 80146B manual, Chapter 3, page 35. Shows you that you shouldn’t rush through a 300+ page manual!