The Complete Electron Pipeline — Development to Rollout

If you haven’t already, see the AWS Command Line Interface (AWS CLI) documentation; the CLI lets you control AWS services from the command line and automate them through scripts.



Once you’ve got the command line tools working, open the credentials file with nano ~/.aws/credentials and insert the keys we copied earlier when creating a new IAM user.

I’ve named my configuration agora-deploy; this is useful if you have multiple projects with separate AWS access on your system.

```
[agora-deploy]
aws_access_key_id=JFRSHDKGYCNFPZMWUTBV
aws_secret_access_key=FK84mfPSdRtZgpDqWvb04fDen0fr3DZ87hTYEE1nS3
```

S3 Buckets

Now that you’ve set up your command line, navigate back to the AWS console and create a new bucket to hold your distribution binaries.

Just use the default settings when creating the bucket, then navigate through the Permissions and into the Bucket Policy.

Insert the following policy, granting public access to GetObject requests on the bucket.

Be sure to insert your bucket’s ARN in place of YOUR_BUCKET_ARN, followed by the top-level directory you’re opening; I used /* here for all contents.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "YOUR_BUCKET_ARN/*"
    }
  ]
}
```

And you’re set, the bucket is now ready to serve your distributions.
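If you prefer the terminal to the console, the same policy can be applied with the AWS CLI. This is a sketch assuming you saved the JSON above as bucket-policy.json and are using the agora-deploy profile; the bucket name is a placeholder:

```shell
# Apply a bucket policy from a local JSON file.
# YOUR_BUCKET_NAME is a placeholder for your actual bucket.
aws s3api put-bucket-policy \
  --bucket YOUR_BUCKET_NAME \
  --policy file://bucket-policy.json \
  --profile agora-deploy
```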

You could manually upload the build files, or continue on to the next section where we will look at automating distribution publishing.

Configuring Electron-Builder

We need to add some additional configuration to our Electron build properties for our new S3 bucket.

Add the following publish settings to your package.json:

```
"build": {
  ...
  "publish": [
    {
      "provider": "s3",
      "bucket": "YOUR_BUCKET_NAME"
    }
  ]
}
```

I also updated my scripts with the following two commands.

```
"scripts": {
  ...
  "dist": "electron-builder --publish never --linux --mac",
  "release": "npm run dist -- --publish always"
}
```

The dist script compiles binaries without publishing, and the release script runs the dist script while overriding the --publish flag (the extra -- tells npm to pass the flag through to the underlying command).

If you’ve configured your AWS command line credentials and spelled everything correctly, you should be able to run npm run release to have electron-builder build and publish your distributions.
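A note on credentials: since we stored ours under a named profile rather than the default one, you may need to tell the publisher which profile to use. This sketch assumes the AWS SDK’s standard AWS_PROFILE environment variable handling:

```shell
# Select the named credentials profile for this invocation only.
AWS_PROFILE=agora-deploy npm run release
```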

Content Delivery

Now that we have our application project set up, building, and publishing, we can move on to creating our web interface to help users easily find the installer they need.

The source code for this section can be found at the following GitHub repository.

MitchPierias/Agora-Static — Static Application Binary Delivery Site (github.com).

Project Setup

I’m going to be building a static site to list the distributions.

For this I’ll be using Gatsby, configuring automatic publishing to S3, and serving the site from CloudFront.

Let’s start by installing Gatsby’s command line interface tools, then run gatsby new followed by our application name to create our project boilerplate.

```
npm install -g gatsby-cli
gatsby new APP_NAME
```

I’ll quickly add links to the binaries using the S3 Object URL within the index.js component.

You can build your site out however you’d like here.
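For reference, these path-style S3 object URLs follow a predictable pattern, so you can assemble them rather than copying each one from the console. A small sketch, where the region, bucket, and file name are all placeholders:

```shell
# Hypothetical helper: build a path-style S3 object URL from region, bucket and key.
s3_url() {
  printf 'https://s3-%s.amazonaws.com/%s/%s\n' "$1" "$2" "$3"
}

s3_url ap-southeast-2 YOUR_BUCKET_NAME YOUR_APP.dmg
# → https://s3-ap-southeast-2.amazonaws.com/YOUR_BUCKET_NAME/YOUR_APP.dmg
```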

```
<Link to="https://s3-ap-southeast-2.amazonaws.com/YOUR_BUCKET_NAME/YOUR_APP.dmg">Mac</Link>
<Link to="https://s3-ap-southeast-2.amazonaws.com/YOUR_BUCKET_NAME/YOUR_APP.AppImage">Linux</Link>
```

Finally we can run npm run build to compile our static site.

We can move on to configuring our S3 and CloudFront environment to display it.

The output of our build will be in the public folder; the contents of this folder are what we will move to our S3 bucket.

Please be aware that some AWS services can take up to 24 hours to propagate changes across all edge locations and become discoverable.

Before we can begin, you’ll need to have a domain name ready.

We could do this with the resource URLs generated for our instances, but there’s really no reason you’d ever need to do such a thing.

Configuring HTTPS

As a bonus to our users, we’re going to set up SSL so they can trust we really are who we say we are.

Within the AWS console, navigate to the Certificate Manager and Request a certificate.

Follow the prompts to set up, entering your domain (or multiple domains) and selecting DNS Validation as the validation method.

Once your certificate has been requested, you’ll need to follow the instructions listed to complete issuance.

It may take up to an hour for Route 53 and Certificate Manager to synchronize and issue the certificate.
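One caveat worth knowing: certificates used with CloudFront must be issued in the us-east-1 (N. Virginia) region, regardless of where your bucket lives. You can confirm your certificate landed there with the AWS CLI; the profile name is the one configured earlier:

```shell
# CloudFront only accepts ACM certificates issued in us-east-1.
aws acm list-certificates --region us-east-1 --profile agora-deploy
```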

S3 Hosting

We’re going to need an S3 bucket to hold all our static build files.

Open S3 and create a new bucket, using your domain name as the bucket’s name.

Next, open the bucket settings, navigate to Properties, and configure Static website hosting like so.

We’re telling our bucket to act as a web server, returning the specified index document when no resource is specified and the error document when a resource isn’t found.

We will set ours to the default files output by Gatsby’s build process.

We don’t need to configure our bucket policy yet; we will have CloudFront do that later.

Setting up CloudFront

We could configure our S3 bucket to serve our content directly, but then we lose the advantages of SSL encryption and high-speed delivery.

Let’s set up a CloudFront instance instead to build an edge cache from our bucket contents and deliver it with SSL.

Open the CloudFront Service within AWS, Create Distribution, and Get Started with a Web distribution.

We have a little bit to configure here so stay patient.

Origin Settings

Origin Domain Name: your S3 bucket (select it from the dropdown)

Default Cache Behavior Settings

Viewer Protocol Policy: Redirect HTTP to HTTPS

Distribution Settings

SSL Certificate: Custom SSL Certificate (select the cert assigned earlier)
Default Root Object: index.html

You can set up any additional configuration you like here, such as custom headers.

Next select Create Distribution and Amazon will start up the new instance.

This will take some time; go have a coffee while the instance starts and propagates across all edge locations.
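If you’d rather poll than wait, the distribution’s state can be checked from the CLI. The distribution ID shown below is a placeholder you can copy from the CloudFront console; the status reads InProgress while propagating and Deployed when ready:

```shell
# Query just the deployment status of the new distribution.
aws cloudfront get-distribution \
  --id YOUR_DISTRIBUTION_ID \
  --query 'Distribution.Status' \
  --profile agora-deploy
```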

We will need the service to be discoverable for the next section, configuring Route 53.

Domain Routing

So we’ve set up CloudFront to cache and serve our static site from S3; now we need to configure Route 53 to provide a user-friendly domain for our CloudFront instance.

Navigate over to Route 53 and select the Hosted Zone associated with your domain, or create one if you haven’t already.

Then Create Record Set: type A — IPv4 address, set Alias to Yes, and set the Alias Target to your CloudFront instance.

Again, the CloudFront instance could take up to 24 hours to appear across other services.

Once this is set up, you should be able to visit your domain after a few minutes and see your static site appear.
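You can sanity-check the routing from a terminal once the record is live. Replace the domain placeholder with your own; this assumes dig and curl are installed:

```shell
# Confirm the A record resolves, then confirm CloudFront answers over HTTPS.
dig +short YOUR_DOMAIN
curl -I https://YOUR_DOMAIN
```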

Automating S3 Deployment

We previously copied the entire contents of the build folder into our S3 bucket manually.

There’s nothing technically wrong with this… but we’re all about automation and making things simpler! So let’s set up gatsby-plugin-s3 to deploy our build to S3 in just one command.

```
npm install gatsby-plugin-s3
```

Then we want to configure the plugin within our gatsby-config.js file.

Add the following configuration object into the plugins array:

```
plugins: [
  ...
  {
    resolve: 'gatsby-plugin-s3',
    options: {
      bucketName: 'YOUR_DOMAIN_NAME'
    }
  }
  ...
]
```

Finally we’re going to add a deploy command to the package.json like so:

```
"scripts": {
  ...
  "deploy": "gatsby-plugin-s3 deploy"
  ...
}
```

Before you can run this you’ll need to call npm run build to update the plugins in .cache, then you can run the npm run deploy command, following the prompts to deploy to S3.

Remember to set up your ~/.aws/credentials with your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the AWS IAM service.
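If you ever need to provision those credentials non-interactively, for example on a CI machine, the file can be generated with a small script. This is a sketch: the profile name matches the one used earlier, both key values are placeholders, and AWS_SHARED_CREDENTIALS_FILE can point somewhere else to avoid touching your real ~/.aws/credentials:

```shell
# Append a named profile to the AWS shared credentials file.
# Both key values below are placeholders, not real credentials.
CREDS_FILE="${AWS_SHARED_CREDENTIALS_FILE:-$HOME/.aws/credentials}"
mkdir -p "$(dirname "$CREDS_FILE")"
cat >> "$CREDS_FILE" <<'EOF'
[agora-deploy]
aws_access_key_id=YOUR_ACCESS_KEY_ID
aws_secret_access_key=YOUR_SECRET_ACCESS_KEY
EOF
```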

More details can be found in the gatsby-plugin-s3 documentation.

Thank you!

You should now have a full production pipeline set up.

We could also add a GraphQL query to our Gatsby site in order to fetch the latest build resource URL from our deployment S3 Bucket.

If you’d like to visit the live website deployed in this example, it can be found at https://agora.


