The BackHub Roadmap for 2020
We’re excited about taking BackHub to the next level this year!
Next level means all things enterprise.
As many more enterprise customers requested our service last year, we began focusing on features like the audit log and security history to help them with compliance. To see what we built in 2019, go here.
We highly value customer feedback and also like to be transparent about our plans, which are based both on that feedback and on keeping up with evolving industry standards and trends.
In this post I’ll share our priority projects for 2020.
This truly is a new level for BackHub. The Private Instance has changed how we think about BackHub in many ways and has had a big influence on our software architecture decisions. First, we removed everything that didn't belong to the core backup functionality (for example, billing logic) and then added those things back as "add-ons" used solely in the cloud instance. This lets us work from one repository instead of branching out or even forking, and it avoids backporting and error-prone tests. Beyond that, all new features and improvements automatically become part of both the cloud and private instances, which means we can get early feedback, detect regressions, and fix bugs much faster.
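The idea of one shared core with cloud-only add-ons can be sketched roughly as follows. The class and method names here are hypothetical, purely to illustrate the pattern, not our actual codebase:

```python
class BackupCore:
    """Core backup pipeline, shared verbatim by cloud and Private Instance."""

    def __init__(self):
        self._addons = []

    def register(self, addon_name):
        """Attach an optional add-on (e.g. billing) without touching core code."""
        self._addons.append(addon_name)

    def pipeline(self):
        """Return the steps this deployment will run."""
        return ["backup"] + list(self._addons)


# Private Instance: core functionality only.
private = BackupCore()

# Cloud instance: the same core, plus cloud-only add-ons.
cloud = BackupCore()
cloud.register("billing")

print(private.pipeline())  # ['backup']
print(cloud.pipeline())    # ['backup', 'billing']
```

Because both deployments share the identical core, a fix or feature landing in `BackupCore` reaches the cloud and the Private Instance at the same time.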
The Private Instance is a new deployment option for enterprise customers who have a high volume of repositories and organizations, or who have security requirements that make it difficult for a third party to manage their backups.
In 2020-Q1 we will roll out the pilot for the Private Instance on AWS. If you are interested in taking part in the pilot program and benefiting from a 20% discount on the first year, read more here.
In 2020-Q2 we will follow up with a deployment option for the Azure cloud, as this is a popular cloud hosting provider for the enterprise.
Choose Cloud Region
Some customers are legally required to keep their data in the USA. About 50% of our customers are based there and want the option to store their backups there as well.
With our cloud infrastructure and the improved modular architecture of the Private Instance, we are well prepared to provide a second cloud instance operating in the USA, in addition to the one we operate in the EU region.
Some of our customers use Git LFS. Git Large File Storage (LFS) replaces large files such as audio samples, videos, datasets, and graphics with text pointers inside Git, while storing the file contents on a remote server at GitHub.
Including these files in the backup is the first step. The challenge is to include the large files in the data recovery process and make sure the pointers in the Git repository reference the right files after recovery.
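Each LFS pointer is a small text file recording the SHA-256 hash and size of the real content (per the Git LFS pointer spec), so a recovery can be checked by hashing the restored file and comparing it against its pointer. The helper names below are our own for illustration:

```python
import hashlib


def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


def verify_recovered_file(pointer_text, recovered_bytes):
    """Check that recovered content matches the oid and size in the pointer."""
    fields = parse_lfs_pointer(pointer_text)
    algo, _, expected_hash = fields["oid"].partition(":")
    if algo != "sha256":
        raise ValueError(f"unexpected hash algorithm: {algo}")
    actual_hash = hashlib.sha256(recovered_bytes).hexdigest()
    return actual_hash == expected_hash and len(recovered_bytes) == int(fields["size"])


# A pointer as Git LFS would store it in the repository:
content = b"example large file contents"
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    f"oid sha256:{hashlib.sha256(content).hexdigest()}\n"
    f"size {len(content)}\n"
)

print(verify_recovered_file(pointer, content))        # True
print(verify_recovered_file(pointer, b"corrupted"))   # False
```

Running a check like this over every pointer after a restore confirms that each large file ended up exactly where its pointer expects it.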
Last year we started implementing an ISMS (Information Security Management System). This largely meant formalizing the security practices we had already applied. That can be very useful, especially when bringing new people on board, and it systematically improves organizational security as well.
We’ve already put into practice most of the policies and procedures that come with the ISMS, with the audit scheduled for 2020-Q2.