5 Steps to Installing WordPress with LEMP on Alibaba Cloud

This article was created in partnership with Alibaba Cloud. Thank you for supporting the partners who make SitePoint possible.

WordPress is one of the most popular content management systems (CMS) on the market, largely because it is completely free to use and extremely versatile. Since WordPress is built on PHP and MySQL, you would typically deploy it on a LAMP stack, regardless of the type of website.

In this article, however, we will see how to deploy a WordPress website on a LEMP stack instead. We have chosen NGINX as our web server in place of the more popular Apache because it is typically faster, more scalable, and easier to secure for a single site.

By completing the steps in this article, you'll have a fresh WordPress site running on an Alibaba Cloud Elastic Compute Service (ECS) instance, on top of a highly performant LEMP stack.

This article assumes that you already have an account on Alibaba Cloud. If you don’t have one yet, just sign up and enjoy the $300 new user free credit.

Step 1: Provision and Secure an Ubuntu 16.04 Server

In the first step, we will create an Alibaba Cloud ECS instance running Ubuntu 16.04. This step is pretty straightforward: it mostly involves purchasing an ECS instance from the Alibaba Cloud console, selecting your preferred configuration, and running some Linux commands.

However, to improve security, we will also harden the server by creating a superuser and disabling root login. We will then configure UFW (Uncomplicated Firewall) on the server.

Read the detailed steps here.

This provides a versatile foundation on which to complete our LEMP stack in the next step.

Step 2: Completing the LEMP Stack

In this step, we will complete a modern LEMP stack for our WordPress website. Specifically, we will set up our LEMP stack with a highly performant combination of NGINX, MariaDB, and PHP 7. This setup is widely regarded as an optimal foundation for a modern WordPress site.

Read the detailed steps here.

At present, we can only visit our site by entering the server IP address in a web browser, so we will fix that in the next step by configuring a domain name to serve our site, and securing everything with an SSL certificate. After that, we will move on to installing WordPress on our LEMP stack.

Step 3: Configuring a Domain and Let’s Encrypt SSL

In the third step, we will run through Domain Management for our site, adding DNS records, and installing a Let’s Encrypt SSL certificate. In the previous two steps, we provisioned and secured our Ubuntu 16.04 server installation, and then completed the LEMP stack with NGINX, MariaDB, and PHP7. However, we will need to configure a domain to make our server publicly accessible. Additionally, we will secure our WordPress site with an SSL certificate so it can only be accessed via HTTPS.

Read the detailed steps here.

Your ECS instance should now have a secured and highly performant LEMP stack installed, with a domain configured, and a secure SSL certificate protecting visitors’ information.

Step 4: Installing WordPress on Your Alibaba Cloud ECS Instance

In the fourth step, we will install WordPress in two different ways: manually, using wget/curl, and with the WordPress command-line interface (WP-CLI). This is where you have a decision to make, as the tutorial walks through both approaches so you can pick whichever suits you.

Continue reading "5 Steps to Installing WordPress with LEMP on Alibaba Cloud" on SitePoint.


Source: Sitepoint

15 Bootstrap Tools and Playgrounds

In this post, we've scoured the web for Bootstrap tools and playgrounds to share with you only the very best out there.

Web designers and developers operate in a great industry. Our expertise and access to affordable development resources give us the ability to do something unique — something which is found in few (if any) industries: the ability to release tools for other web designers and developers.

Tools for people like us are plentiful. Many of them are free, some are paid. All of them are awesome.

There are tools and playgrounds for almost everything — including Bootstrap. Let’s review the best of them.

1. Pingendo

Pingendo

Price: Free for non-commercial use or $99 one-time payment

Pingendo is a Bootstrap 4 builder which is available in two flavors, an online playground and a desktop version available for Windows, macOS and Linux.

Pingendo comes with quite a nice selection of templates to get you really bootstrapped with your web design. Amongst the available templates, you’ll find an App intro site, a Conference site template and a Restaurant template, which comes in various themes.

There are also a number of wireframes ready for use, including a photo album, a cover page, a checkout form page, a landing page, a product page and a pricing table.

2. Brix

Brix

Price: from $14.90/month

Brix is a Bootstrap builder for designing, creating and editing responsive websites and UIs. The service is completely cloud-based, built as a rapid prototyping tool for the Bootstrap framework.

The tool draws on experience its creators have gathered since the early days of Bootstrap.

More than 20 templates are also available to use as a starting point for your web pages.

3. Jetstrap

Jetstrap

Price: from $16/month for 3 projects

Jetstrap is a Bootstrap interface builder that sits somewhere between a mockup tool and an interface-building tool, bringing a bit of both to the table. The great thing is that rather than mocking up your screens, you're actually building them on the fly.

The tool is fully web-based and includes drag-and-drop components and snippets of good clean markup ready for creating complicated components easily.

4. Pinegrow

Pinegrow

Price: from $49

Pinegrow is a desktop web editor that allows you to build responsive sites using live, multi-page editing, CSS and Sass styling, and components for Bootstrap, Foundation and WordPress.

Available for macOS, Windows and Linux, you can develop using Bootstrap 3, 4, or other frameworks as you prefer.

5. Bootstrap Studio

Bootstrap Studio

Price: from $25

Bootstrap Studio is a desktop app, but it does offer an online demo of its capabilities.

It’s built around drag-and-drop functionality and comes with quite a good set of built-in components, including headers, footers, galleries, and slideshows.

6. Bootply

Bootply

Price: Free with ads, $9/month

Bootply touts itself as a Bootstrap playground, editor and builder.

Of all the tools we've seen so far, this seems to be the one best suited to those who want the power of drag and drop but with the full ability to code at hand. It allows you to switch between the Code Editor and the Preview so you can quickly check your build.

Bootply also comes with a number of pre-built starter templates to get you up and running quickly. Besides your run-of-the-mill landing page, single-page app or article, you've also got more complex templates such as the Control Panel and Dashboard templates and a modern layout for a tech news site.

7. BootMetro

BootMetro

Price: Free

This is a simple UI framework which allows you to create a Metro-like interface using Bootstrap.

Continue reading "15 Bootstrap Tools and Playgrounds" on SitePoint.


Source: Sitepoint

How the Lightning Network Helps Blockchains Scale

This introduction to the Lightning Network was originally published at Bruno’s Bitfalls website, and is reproduced here with permission.

Bitcoin is currently impractical to use because of slow and expensive transactions plaguing its blockchain. Most people use it as a store of value (the digital gold fallacy) or to trade on exchanges. A concept known as the Lightning Network was introduced as a solution to this scalability issue.

The Basics of the Lightning Network

The Lightning Network was first described in a 2015 whitepaper by Joseph Poon and Thaddeus Dryja. The concept, however, was actually introduced by Satoshi Nakamoto in an email to Mike Hearn in 2013.

The Lightning Network works through payment channels, which are actually multi-signature (multi-sig) wallets. A multi-sig wallet is just a Bitcoin address which requires the signatures (private keys) of several owners before money in that address can be spent. You can view them as bank vaults which require two different keys to be turned at the same time in order to open.

A vault with two keys

A multi-sig wallet can, for example, be the common Bitcoin address of a married couple, wherein they both have to sign a transaction in order to spend their common Bitcoin.

The purpose of payment channels is the regular execution of smaller payments and the avoidance of high transaction fees. Examples of relationships ideal for LN channels are employee and employer, consumer and producer, utility provider and consumer, coffee drinker and coffee shop, and so on. The idea is to let a customer open a payment channel with their coffee shop and regularly pay for coffee without having to wait for confirmation (currently 10 to 60 minutes).

How the Lightning Network Works

Alice and Bob

Let’s explain with a step-by-step example. Our imaginary scenario is as follows:

Bob wants to pay Alice to write articles for him. The deal is 10 BTC for a total of 100 posts, or 0.1 BTC per post.

In a traditional Bitcoin system, that would require one hour on average with a fee of $5 to $500 per transaction, depending on how backlogged the network is. Because both Alice and Bob are bitcoin maximalists, they choose to open an LN channel rather than go with a cheaper and easier-to-use altcoin.

Step 1: Opening a Channel

Bob sends an initial channel-opening transaction

Bob creates a regular Bitcoin transaction on the main chain which defines the following:

  • who he’s opening the channel with
  • how much BTC he’s sending into the channel (10 BTC)
  • after how long a period of time (one week in this case) he has the right to take the 10 BTC back if Alice does not respond.

The latter is actually a sub-transaction in the main transaction with a “timelock” function, which makes sure that, in spite of both parties having confirmed it, the money is not moveable for a week.

So, Bob sends Alice two transactions — one in which he suggests opening a payment channel with a 10 BTC deposit on a multi-sig which is opened with that transaction, and one in which he says the 10 BTC go back to him if there’s been no activity in the channel for a week.

Step 2: Accepting the Opening of a Channel

Alice accepts and signs

Alice receives two transactions in which she can see that Bob is offering 10 BTC on the multi-sig address with the two of them as the participating parties. She can also see he’s added the condition to return the money to him after one week of inactivity. She accepts this and signs the transactions, after which she broadcasts the transactions and they’re sent to the main blockchain, finalizing the channel’s creation.

Signed transactions waiting

It’s important to define two concepts here: signing a transaction and confirming or broadcasting a transaction. A signed transaction is merely ready for sending to the blockchain and constitutes an agreement between parties. It’s not visible on the blockchain. A broadcast or confirmed transaction is sent to the blockchain and closes the payment channel, settling balances.

Signing the first transaction opens the channel and causes Bob's 10 BTC to be deposited into the multi-sig address. The second transaction, although signed, only becomes active after a week, at which point it allows Bob to take all 10 BTC back.

Channel is open, transactions are in the blockchain

Alice and Bob now have a week to do the first bit of business.

Continue reading "How the Lightning Network Helps Blockchains Scale" on SitePoint.


Source: Sitepoint

The Future Is Here! Augmented And Virtual Reality Icon Set

The Smashing Editorial

2018-05-18T15:45:28+02:00

What once sounded like science fiction has become a reality: All you need is to grab a VR headset or simply use your web browser and you suddenly find yourself in an entirely different place, a different time, or in the middle of your favorite game.

Augmented and virtual reality are changing the way we experience and interact with the world around us — from the way we consume media and shop to the way we communicate and learn. Regardless of whether you're skeptical of this evolution or just can't wait to fully immerse yourself in virtual worlds, one thing is for sure: exciting times are ahead of us.


Augmented And Virtual Reality Icon Set

To share their excitement about AR and VR, the creative folks at Vexels have designed a free set of 33 icons that take you on a journey through the new technology as well as the worlds it encompasses. The set includes useful icons of devices but also cute, cartoonish illustrations of people interacting with them. All icons are available in four formats (PNG, EPS, AI, and SVG) so you can resize and customize them until they match your project’s visual style perfectly. Happy exploring!


Please Give Credit Where Credit Is Due

This set is released under a Creative Commons Attribution 3.0 Unported license, i.e. you may modify the size, color and shape of the icons. Attribution is required, so if you would like to spread the word in blog posts or anywhere else, please remember to credit Vexels and provide a link to this article.

Here’s a sneak peek of some of the icons:

Full Preview Of The Icon Set

Source: Smashing Magazine

How to Make a Real-Time Sports Application Using Node.js

What You'll Be Creating

In today’s article
I’m going to demonstrate how to make a web application that will display live
game scores from the NHL. The scores will update automatically as the games
progress.

This is a very
exciting article for me, as it allows me the chance to bring two of my favorite
passions together: development and sports.

The technologies
that will be used to create the application are:

  1. Node.js
  2. Socket.io
  3. MySportsFeeds.com

If you don’t have
Node.js installed, visit their download
page
now and set it up before continuing.

What Is Socket.io?

Socket.io is a
technology that connects a client to a server. In this example, the client is a
web browser and the server is the Node.js application. The server can have
multiple clients connected to it at any given time.

Once the connection
has been established, the server can send messages to all of the clients or an
individual client. In return, the client can send a message to the server, allowing for bi-directional real-time communication.

Before Socket.io,
web applications would commonly use AJAX, and both the client and server would
poll each other looking for events. For example, every 10 seconds an AJAX call
would occur to see if there were any messages to handle.

Polling for messages
caused a significant amount of overhead on both the client and server as it
would be constantly looking for messages when there were none.

With Socket.io, messages are received instantaneously, without needing to look for messages, reducing the overhead.

Sample Socket.io Application

Before we consume
the real-time sports data, let’s create an example application to demonstrate
how Socket.io works.

To begin, I am going to create a new Node.js application. In a console window, I am going to navigate to C:\GitHub\NodeJS, create a new folder for my application, and create a new application:

I used all the
default settings.

Because we are making a web application, I'm going to use an NPM package called Express to simplify the setup. In a command prompt, install it as follows: npm install express --save

And of course we will need to install the Socket.io package: npm install socket.io --save

Let’s begin by
creating the web server. Create a new file called index.js and place the
following code within it to create the web server using Express:
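
A minimal sketch of that file, based on the description that follows (Express, an HTTP server on port 3000, and a root route that returns index.html), might look something like this:

[code language="javascript"]
// index.js - minimal Express web server (sketch)
var express = require('express');
var app = express();
var http = require('http').Server(app);

// Route for the root of the site: return the index.html file
app.get('/', function (req, res) {
  res.sendFile(__dirname + '/index.html');
});

// Listen on port 3000, e.g. http://localhost:3000
http.listen(3000, function () {
  console.log('HTTP server started on port 3000');
});
[/code]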

If you are not
familiar with Express, the above code example includes the Express library and
creates a new HTTP server. In this example, the HTTP server is listening on
port 3000, e.g. http://localhost:3000. A
route is created at the root of the site “/”. The result of the route
returns an HTML file: index.html.

Before we create the
index.html file, let’s finish the server by setting up Socket.io. Append the
following to your index.js file to create the Socket server:
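
A sketch of that addition, based on the description below, could be:

[code language="javascript"]
// Append to index.js - attach Socket.io to the existing HTTP server (sketch)
var io = require('socket.io')(http);

// The 'connection' event fires each time a client connects to the server
io.on('connection', function (socket) {
  console.log('Client connection received');
});
[/code]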

Similar to Express,
the code begins by importing the Socket.io library. This is stored in a
variable called io. Next, using the io variable, an event handler is created with
the on function. The event being
listened for is connection. This event
is called each time a client connects to the server.

Let’s now create our
very basic client. Create a new file called index.html and place the following
code within:
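
In essence, the page only needs to load the Socket.io client script and open a connection; here is a sketch of that JavaScript portion, with the HTML side noted in comments:

[code language="javascript"]
// index.html (sketch) - the page first loads the Socket.io client script
// served automatically by the server:
//   <script src="/socket.io/socket.io.js"></script>
// and then a small inline script opens the connection:
var socket = io();
[/code]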

The HTML above loads
the Socket.io client JavaScript and initializes a connection to the server. To
see the example, start your Node application: node index.js

Then, in your browser, navigate to http://localhost:3000. Nothing
will appear on the page; however, if you look at the console where the Node
application is running, two messages are logged:

  1. HTTP server started on port 3000
  2. Client connection received

Now that we have a
successful socket connection, let’s put it to use. Let’s begin by sending a
message from the server to the client. Then, when the client receives the
message, it can send a response back to the server.

Let’s look at the
abbreviated index.js file:
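
A sketch of what that abbreviated handler might look like, with the two additions described below:

[code language="javascript"]
// index.js (abbreviated sketch) - send an event to the client and listen for a reply
io.on('connection', function (socket) {
  console.log('Client connection received');

  // Send a named event to the newly connected client
  socket.emit('sendToClient', { hello: 'world' });

  // Listen for the client's reply and log whatever data it sends
  socket.on('receivedFromClient', function (data) {
    console.log(data);
  });
});
[/code]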

The previous io.on function has been updated to include a
few new lines of code. The first, socket.emit, sends the message to the client. The sendToClient
is the name of the event. By naming events, you can send different types of
messages so the client can interpret them differently. The second addition is
the socket.on, which also contains an
event name: receivedFromClient. This
creates a function that accepts data from the client. In this case, the data is
logged to the console window.

That completes the
server-side amendments; it can now send and receive data from any connected
clients.

Let’s complete this
example by updating the client to receive the sendToClient
event. When it receives the event, it can respond with the receivedFromClient event back to the server.

This is accomplished
in the JavaScript portion of the HTML, so in the index.html
file, I have updated the JavaScript as follows:
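
A sketch of that updated client-side JavaScript (the reply payload here is just a placeholder):

[code language="javascript"]
// Inside the <script> block of index.html (sketch)
var socket = io();

// Listen for the event the server emits on connection
socket.on('sendToClient', function (data) {
  console.log(data); // logs {hello: "world"} in the browser console

  // Respond to the server using the same emit mechanism
  socket.emit('receivedFromClient', { my: 'data' });
});
[/code]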

Using the
instantiated socket variable, we have
very similar logic on the server with a socket.on
function. For the client, it is listening to the sendToClient
event. As soon as the client is connected, the server sends this message. When
the client receives it, it is logged to the console in the browser. The client
then uses the same socket.emit that the
server used to send the original event. In this instance, the client sends back
the receivedFromClient event to the
server. When the server receives the message, it is logged to the console
window.

Try it out for
yourself. First, in a console, run your Node application: node index.js. Then load http://localhost:3000 in your browser.

Check the web browser console and you should see the following JSON data logged: {hello: "world"}

Then, in the command prompt where the Node application is running, you should see the data the client sent back logged to the console.

Both the client and
server can use the JSON data received to perform specific tasks. We will learn
more about that once we connect to the real-time sports data.

Sports Data

Now that we have
mastered how to send and receive data to and from the client and server, this
can be leveraged to provide real-time updates. I chose to use sports data,
although the same theory is not limited to sports. Before I began this project,
I researched different sports data. The one I settled on, because they offer
free developer accounts, was MySportsFeeds (I am not affiliated with them in any way). To access the real-time
data, I signed up for an account and then made a small donation. Donations start at
$1 to have data updated every 10 minutes. This will be good for the example.

Once your account is set up, you can proceed to setting up access to their API. To assist with this, I am going to use their NPM package: npm install mysportsfeeds-node --save

After the package
has been installed, API calls can be made as follows:
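
The constructor arguments, season string, and getData() signature in the sketch below are assumptions based on the package's documented usage at the time, so double-check them against the mysportsfeeds-node README; the call looks roughly like this:

[code language="javascript"]
// Sketch of a MySportsFeeds API call (method signatures and the season
// string are assumptions - check the mysportsfeeds-node README for your version)
var MySportsFeeds = require('mysportsfeeds-node');

var msf = new MySportsFeeds('1.2', true); // API version, verbose logging
msf.authenticate('your_username', 'your_password');

// Build today's date as YYYYMMDD for the "fordate" parameter
var today = new Date();
var fordate = today.getFullYear() +
  ('0' + (today.getMonth() + 1)).slice(-2) +
  ('0' + today.getDate()).slice(-2);

// Request the NHL scoreboard for today; force: true always returns a response
var data = msf.getData('nhl', '2017-2018-regular', 'scoreboard', 'json', {
  fordate: fordate,
  force: true
});
[/code]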

In the example
above, be sure to replace the call to the authenticate
function with your username and password.

The following code executes an API call to get the NHL scoreboard for today. The fordate variable is what specifies today. I've also set force to true so that a response is always returned, even when the data has not changed.

With the current
setup, the results of the API call get written to a text file. In the final
example, this will be changed; however, for demonstration purposes, the results
file can be reviewed in a text editor to understand the contents of the
response. The results contain a scoreboard
object. This object contains an array called gameScore.
This object stores the result of each game. Each object contains a child object
called game. This object provides the
information about who is playing.

Outside of the game
object, there are a handful of variables that provide the current state of the
game. The data changes based on the state of the game. For example, when the
game hasn’t started, there are only a few variables that tell us the game is not
in progress and has not started.

When the game is in
progress, additional data is provided about the score, what period the game is
in, and how much time is remaining. We will see this in action when we get to
the HTML to show the game in the next section.

Real-Time Updates

We have all the
pieces to the puzzle, so it is now time to put the puzzle together to reveal the
final picture. Currently, MySportsFeeds has limited support for pushing data to
us, so we will have to poll the data from them. Luckily, we know the data only
changes once every 10 minutes, so we don’t need to add overhead by polling for
changes too frequently. Once we poll the data from them, we can push those
updates from the server to all clients connected.

To perform the polling, I will use the JavaScript setInterval function to call the API every 10 minutes (in my case) to look for updates.
When the data is received, an event is sent to all of the connected clients.
When the clients receive the event, the game scores will be updated with
JavaScript in the web browser.

MySportsFeeds will
also be called when the Node application first starts up. This data will be
used for any clients who connect before the first 10-minute interval. This is
stored in a global variable. This same global variable is updated as part of
the interval polling. This will ensure that when any new clients connect after
the polling, they will have the latest data.

To assist with some
code cleanliness in the main index.js
file, I have created a new file called data.js.
This file will contain a function that is exported (available in the index.js file) that performs the previous call
to the MySportsFeeds API. Here are the full contents of that file:
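
A sketch of data.js, reusing the same (assumed) package calls as before and exporting a getData function that returns the Promise:

[code language="javascript"]
// data.js (sketch) - wraps the MySportsFeeds call for use from index.js.
// Constructor and getData arguments are assumptions; the third constructor
// argument (null) is intended to disable writing results to a file.
var MySportsFeeds = require('mysportsfeeds-node');

var msf = new MySportsFeeds('1.2', true, null);
msf.authenticate('your_username', 'your_password');

// Returns a Promise that resolves with the scoreboard JSON for today
exports.getData = function () {
  var today = new Date();
  var fordate = today.getFullYear() +
    ('0' + (today.getMonth() + 1)).slice(-2) +
    ('0' + today.getDate()).slice(-2);

  return msf.getData('nhl', '2017-2018-regular', 'scoreboard', 'json', {
    fordate: fordate,
    force: true
  });
};
[/code]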

A getData function is exported and returns the
result of the call, which in this case is a Promise
that will be resolved in the main application.

Now let’s look at
the final contents of the index.js file:
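
Pulling everything together, a sketch of the final file based on the walkthrough below:

[code language="javascript"]
// index.js (final version, sketch)
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var data = require('./data.js');
var latestData; // most recent scoreboard, for clients that connect later

// Populate latestData once at startup
data.getData().then(function (result) {
  latestData = result;
});

// Root route and HTTP server
app.get('/', function (req, res) {
  res.sendFile(__dirname + '/index.html');
});

http.listen(3000, function () {
  console.log('HTTP server started on port 3000');
});

// Send the current data to each client as soon as it connects
io.on('connection', function (socket) {
  socket.emit('data', latestData);
});

// Poll MySportsFeeds every 10 minutes and push updates to all clients
setInterval(function () {
  data.getData().then(function (result) {
    latestData = result;
    io.emit('data', latestData);
  });
}, 10 * 60 * 1000);
[/code]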

The first few lines of code above instantiate the required libraries and the global latestData variable. The libraries used are: Express, the HTTP server created with Express, Socket.io, and the aforementioned data.js file we just created.

With the necessities taken care of, the application populates latestData for clients who will connect when the server is first started.

The next few lines
set up a route for the root page of the website (http://localhost:3000/) and start the HTTP
server to listen on port 3000.

Next, Socket.io is set up to look for connections. When a new connection is received, the server emits an event called data with the contents of the latestData variable.

Finally, the last chunk of code creates the polling interval. When the interval fires, the latestData variable is updated with the results of the API call, and the same data event is then emitted to all clients.

You may notice that
when the client connects and an event is emitted, it is emitting the event with
the socket variable. This approach will
send the event to that connected client only. Inside the interval, the global io is used to emit the event. This will send
the event to all clients.

That completes the
server. Let’s work on the client front-end. In an earlier example, I created a basic index.html file that
set up the client connection that would log events from the server and send one
back. I am going to extend that file to contain the completed example.

Because the server
is sending us a JSON object, I am going to use jQuery and leverage a jQuery
extension called JsRender.
This is a templating library. It will allow me to create a template with HTML
that will be used to display the contents of each NHL game in an easy-to-use, consistent manner. In a moment, you will see the power of this library. The
final code is over 40 lines of code, so I am going to break it down into
smaller chunks, and then display the full HTML together at the end.

This first part creates the template that will be used to show the game data.

The template is
defined using a script tag. It contains
the id of the template and a special script type called text/x-jsrender. The
template defines a container div for
each game that contains a class game to
apply some basic styling. Inside this div, the templating begins.

In the next div, the
away and home team are displayed. This is done by concatenating the city and
team name together from the game object
from the MySportsFeed data.

{{:game.awayTeam.City}} is how I define a token that will be replaced with the actual value when the template is rendered. This syntax is defined by the JsRender library.

Once the teams are
displayed, the next chunk of code does some conditional logic. When the game is
unPlayed, a string will be outputted
that the game will start at {{:game.time}}.

When the game is not completed, the current score is displayed: Current Score: {{:awayScore}} - {{:homeScore}}. And finally, there's some tricky little logic to identify what period the hockey game is in, or whether it is in intermission.
the hockey game is in or if it is in intermission.

If the variable currentIntermission is provided in the
results, then I use a function I defined called ordinal_suffix_of, which will convert the period number to read: 1st (2nd, 3rd, etc.)
Intermission.

When it is not in
intermission, I look for the currentPeriod
value. This also uses ordinal_suffix_of to show that the game is in the 1st (2nd, 3rd, etc.) period.

Beneath this,
another function I defined called time_left
is used to convert the number of seconds remaining into the number of
minutes and seconds remaining in the period. For example: 10:12.

The final part of
the code displays the final score because we know the game has completed.

Here is an example
of what it looks like when there is a mix of finished games, in progress games,
and games that have not started yet (I’m not a very good designer, so it looks
as you would expect when a developer makes their own User Interface).

An example of finished games

Next up is a chunk
of JavaScript that creates the socket, the helper functions ordinal_suffix_of
and time_left, and a variable that references the jQuery template created.
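
A sketch of those pieces (the template id here is an assumption):

[code language="javascript"]
// Client-side setup (sketch)
var socket = io();

// Reference to the JsRender template defined in the script tag
var tmpl = $.templates('#gameTemplate'); // the template id is an assumption

// Convert a number to its ordinal form: 1 -> "1st", 2 -> "2nd", 3 -> "3rd", ...
function ordinal_suffix_of(i) {
  var j = i % 10,
      k = i % 100;
  if (j === 1 && k !== 11) { return i + 'st'; }
  if (j === 2 && k !== 12) { return i + 'nd'; }
  if (j === 3 && k !== 13) { return i + 'rd'; }
  return i + 'th';
}

// Convert seconds remaining into "minutes:seconds", e.g. 612 -> "10:12"
function time_left(seconds) {
  var minutes = Math.floor(seconds / 60);
  var secs = seconds % 60;
  return minutes + ':' + (secs < 10 ? '0' + secs : secs);
}
[/code]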

The final piece of
code is the code to receive the socket event and render the template:
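
A sketch of that handler:

[code language="javascript"]
// When the 'data' event arrives, render one game per element of
// scoreboard.gameScore into the #data placeholder div
socket.on('data', function (data) {
  $('#data').html(tmpl.render(data.scoreboard.gameScore));
});
[/code]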

I have a placeholder div with the id of data. The result of the template rendering (tmpl.render) writes the HTML to this container. What is really neat is that the JsRender library can accept an array of data, in this case data.scoreboard.gameScore, iterate through each element in the array, and create one game per element.

Here is the final
HTML and JavaScript all together:

Start the Node
application and browse to http://localhost:3000
to see the results for yourself!

Every X minutes, the
server will send an event to the client. The client will redraw the game
elements with the updated data. So when you leave the site open and
periodically look at it, you will see the game data refresh when games are
currently in progress.

Conclusion

The final product
uses Socket.io to create a server that clients connect to. The server fetches
data and sends it to the client. When the client receives the data, it can
seamlessly update the display. This reduces load on the server because the
client only performs work when it receives an event from the server.

Sockets are not
limited to one direction; the client can also send messages to the server. When
the server receives the message, it can perform some processing.

Chat applications
would commonly work this way. The server would receive a message from the
client and then broadcast to all connected clients to show that someone has
sent a new message.

I hope you enjoyed this article; I had a blast creating this real-time sports application for one of my favorite sports!


Source: Nettuts Web Development

Monthly Web Development Update 5/2018: Browser Performance, Iteration Zero, And Web Authentication

Anselm Hannemann

2018-05-18T13:51:17+02:00

As developers, we often talk about performance and request browsers to render things faster. But when they finally do, we demand even more performance.

Alex Russell from the Chrome team now shared some thoughts on developers abusing browser performance and explains why websites are still slow even though browsers have reinvented themselves with incredibly fast rendering engines. This is in line with an article by Oliver Williams in which he states that we're focusing on the wrong things: instead of delivering the fastest solutions for slower machines and browsers, we're serving even bigger bundles with polyfills and transpiled code to every browser.

It certainly isn’t easy to break out of this pattern and keep bundle size to a minimum in the interest of the user, but we have the technologies to achieve that. So let’s explore non-traditional ways and think about the actual user experience more often — before defining a project workflow instead of afterward.

Front-End Performance Checklist 2018

To help you cater for fast and smooth experiences, Vitaly Friedman summarized everything you need to know to optimize your site’s performance in one handy checklist. Read more →

News

General

  • Oliver Williams wrote about how important it is that we rethink how we’re building websites and implement “progressive enhancement” to make the web work great for everyone. After all, it’s us who make the experience worse for our users when blindly transpiling all our ECMAScript code or serving tons of JavaScript polyfills to those who already use slow machines and old software.
  • Ian Feather reveals that around 1% of all requests for JavaScript on BuzzFeed time out. That’s about 13 million requests per month. A good reminder of how important it is to provide a solid fallback, progressive enhancement, and workarounds.
  • The new GDPR (General Data Protection Regulation) directive is coming very soon, and while our inboxes are full of privacy policy updates, one thing that’s still very unclear is which services can already provide so-called DPAs (Data Processing Agreements). Joschi Kuphal collects services that offer a DPA, so that we can easily look them up and see how we can obtain a copy in order to continue using their services. You can help by contributing to this resource via Pull Requests.

UI/UX

Product design principles
How to create a consistent, harmonious user experience when designing product cards? Mei Zhang shares some valuable tips. (Image credit)

Security

Privacy

  • The GDPR Checklist is another helpful resource for people to check whether a website is compliant with the upcoming EU directive.
  • Bloomberg published a story about the open-source privacy-protection project pi-hole, why it exists and what it wants to achieve. I use the software daily to keep my entire home and work network tracking-free.
GDPR Compliance Checklist
Achieving GDPR Compliance shouldn’t be a struggle. The GDPR Compliance Checklist helps you see clearer. (Image credit)


Source: Smashing Magazine

Distributed App Deployment with Kubernetes & MongoDB Atlas

This article was originally published on MongoDB. Thank you for supporting the partners who make SitePoint possible.

Storytelling is one of the parts of being a Developer Advocate that I enjoy. Sometimes the stories are about the special moments when the team comes together to keep a system running or build it faster. But there are also less-than-glorious tales to be told about the software deployments I've been involved in. And in situations where we needed to deploy several times a day, we're talking nightmares.

For some time, I worked at a company that believed that deploying to production several times a day was ideal for project velocity. Our team was working to ensure that advertising software across our media platform was always being updated and released. One of the issues was a lack of real automation in the process of applying new code to our application servers.

What both ops and development teams had in common was a desire for improved ease and agility around application and configuration deployments. In this article, I’ll present some of my experiences and cover how MongoDB Atlas and Kubernetes can be leveraged together to simplify the process of deploying and managing applications and their underlying dependencies.

Let’s talk about how a typical software deployment unfolded:

  1. The developer would send in a ticket asking for the deployment
  2. The developer and I would agree upon a time to deploy the latest software revision
  3. We would modify an existing bash script with the appropriate git repository version info
  4. We’d need to manually back up the old deployment
  5. We’d need to manually create a backup of our current database
  6. We’d watch the bash script perform this “Deploy” on about six servers in parallel
  7. Wave a dead chicken over my keyboard

Some of these deployments would fail, requiring a return to the previous version of the application code. This process of "rolling back" to a prior version involved me manually copying the repository to the older version, performing manual database restores, and finally confirming with the team that used this system that all was working properly. It was a real mess, and I really wasn't in a position to change it.

I eventually moved into a position which gave me greater visibility into what other teams of developers, specifically those in the open source space, were doing for software deployments. I noticed that — surprise! — people were no longer interested in doing the same work over and over again.

Developers and their supporting ops teams have been given keys to a whole new world in the last few years by utilizing containers and automation platforms. Rather than doing manual work required to produce the environment that your app will live in, you can deploy applications quickly thanks to tools like Kubernetes.

What’s Kubernetes?

Kubernetes is an open-source system for automating deployment, scaling, and management of containerized applications. Kubernetes can help reduce the amount of work your team will have to do when deploying your application. Along with MongoDB Atlas, you can build scalable and resilient applications that stand up to high traffic or can easily be scaled down to reduce costs. Kubernetes runs just about anywhere and can use almost any infrastructure. If you’re using a public cloud, a hybrid cloud or even a bare metal solution, you can leverage Kubernetes to quickly deploy and scale your applications.

The Google Kubernetes Engine is built into the Google Cloud Platform and helps you quickly deploy your containerized applications.

For the purposes of this tutorial, I will upload our image to GCP and then deploy to a Kubernetes cluster so I can quickly scale up or down our application as needed. When I create new versions of our app or make incremental changes, I can simply create a new image and deploy again with Kubernetes.

Why Atlas with Kubernetes?

By using these tools together for your MongoDB Application, you can quickly produce and deploy applications without worrying much about infrastructure management. Atlas provides you with a persistent data-store for your application data without the need to manage the actual database software, replication, upgrades, or monitoring. All of these features are delivered out of the box, allowing you to build and then deploy quickly.

In this tutorial, I will build a MongoDB Atlas cluster where our data will live for a simple Node.js application. I will then turn the app and configuration data for Atlas into a container-ready image with Docker.

MongoDB Atlas is available across most regions on GCP so no matter where your application lives, you can keep your data close by (or distributed) across the cloud.

Figure 1: MongoDB Atlas runs in most GCP regions

Requirements

To follow along with this tutorial, users will need some of the following requirements to get started:

First, I will download the repository for the code I will use. In this case, it’s a basic record keeping app using MongoDB, Express, React, and Node (MERN).

Continue reading "Distributed App Deployment with Kubernetes & MongoDB Atlas" on SitePoint.


Source: Sitepoint

How to Use Service Workers to Communicate Across Browser Tabs

This interview has been republished from Versioning, SitePoint’s daily subscription newsletter helping developers stay up-to-date and knowledgeable by offering curated links to the essentials in front-end, back-end, design and UX, news, business and more. Learn more and sign up here.

Sign up to Versioning

Tim Evko

Tim Evko is a front-end developer managing a primarily React codebase, and team lead at ExecThread, a company focused on surfacing job openings to professionals. When not at ExecThread, he spends the rest of his time at a local gym, working on being a better competitive CrossFit athlete.

Which tech idea or trend excites you the most at the moment?

Lately I’ve been fascinated with Service Worker technology, especially when used to help make sites load fast and become interactive quickly. I love performance and web-app-offline capability, and I think the community can benefit from the increased focus on building resilient, versatile, and fast applications.

Service Workers are especially fun to work with because of how much you can do with them. I recently learned that you can use a Service Worker to communicate across tabs in a browser, allowing each individual tab to reload itself if the cache is stale. It’s not an easy technology to work with but I’m very glad that it’s around!
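
As a rough illustration of the cross-tab technique Tim describes (a sketch, not code from ExecThread), a Service Worker can broadcast a message to every open tab, and each tab can decide whether to reload itself:

[code language="javascript"]
// In the service worker: notify every controlled tab, e.g. after detecting a stale cache
self.clients.matchAll({ type: 'window' }).then(function (clients) {
  clients.forEach(function (client) {
    client.postMessage({ type: 'cache-stale' });
  });
});

// In each page/tab: listen for the broadcast and reload if needed
navigator.serviceWorker.addEventListener('message', function (event) {
  if (event.data && event.data.type === 'cache-stale') {
    window.location.reload();
  }
});
[/code]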

Continue reading "How to Use Service Workers to Communicate Across Browser Tabs" on SitePoint.


Source: Sitepoint

The CSS Grid Layout vs CSS Frameworks Debate

With cutting-edge CSS standards like CSS Grid Layout and Flexbox, coding a webpage layout is no longer such a pain.

If you add to this that browser support for both Grid and Flexbox is pretty good too, then the question is bound to come up: Why should I learn and use a CSS framework in my development work?

In this article I will focus on Bootstrap, since it’s arguably one of the most popular among all the CSS UI libraries available out there.

In my view, there are still a number of reasons why it makes sense to learn and use Bootstrap today.

Here are a few of them for you.

What Is CSS Grid?

Rachel Andrew, a well-known speaker and writer on all things CSS Grid-related, defines it as follows:

Grid is a grid system. It allows you to define columns and rows in your CSS, without needing to define them in markup. You don’t need a tool that helps you make it look like you have a grid, you actually have a grid!

The implementation of this CSS standard gives developers the much-needed ability to build page layouts with native CSS code, with no dependency on the HTML markup except for the presence of a wrapper element that acts as your containing grid. Just imagine the flexibility and the potential for creativity in web design!

For instance, you don’t need custom classes or extra rows in your markup to build this simple layout:

Simple webpage layout built with CSS Grid.

Here’s the HTML:

[code language="html"]
<div class="grid">
  <header>Header content</header>
  <main>Main content</main>
  <aside>Sidebar</aside>
  <footer>Footer</footer>
</div>
[/code]

As for the CSS, this is where you’re going to build your visual layout. All it takes in this simple case is a few lines of code:

[code language="css"]
.grid {
  display: grid;
  grid-template-columns: repeat(12, 1fr);
  grid-template-rows: 50px 150px 50px;
}

header, footer {
  grid-column: span 12;
}

main {
  grid-column: span 8;
}

aside {
  grid-column: span 4;
}
[/code]

That’s it, you’re done! Not bad.

What Is Bootstrap?

At the time of this writing, 3.6% of the entire Internet uses Bootstrap:

Bootstrap usage.

On the Bootstrap website, you'll find this definition:

Bootstrap is an open source toolkit for developing with HTML, CSS, and JS. Quickly prototype your ideas or build your entire app with our Sass variables and mixins, responsive grid system, extensive prebuilt components, and powerful plugins built on jQuery.

In other words, Bootstrap gives you ready-made components that allow you to whip up a beautiful webpage in no time.

Just write the appropriate markup, and your app looks great out of the box.

Bootstrap also makes it super easy to customize its look and feel to your liking, and lets you pick and choose the bits you need for your project.

Why Bootstrap When We Have Grid?

The biggest complaint about Bootstrap has always been code bloat. The reasoning was that it included lots of extra CSS code that remained unused in your projects. The second biggest complaint was that Bootstrap components were styled in every detail, and this could present some problems when it came to overriding some CSS rules.

Starting with the latest version of this popular front-end component library, both criticisms fall apart: Bootstrap is totally modular, so you just include what you need. Also, the Sass files are structured in such a way as to make it very convenient to customize the original styles to your needs.

Today, the main argument against using Bootstrap is the fact that, with CSS Grid, CSS has a grid system of its own, which doesn't have any external dependencies and, once learned, enables devs to build all sorts of layouts with relative ease.

Although I'm a CSS Grid fan, I think Bootstrap still has its place in front-end development, and will have for some time to come.

Here are at least three reasons why.

Continue reading "The CSS Grid Layout vs CSS Frameworks Debate" on SitePoint.


Source: Sitepoint

What is a Bitcoin Node? Mining versus Validation

This introduction to Bitcoin Nodes was originally published at Bruno’s Bitfalls website, and is reproduced here with permission.

You’ll often hear the word node thrown around in blockchain and cryptocurrency circles. If you’ve read our intro to blockchain (and we highly recommend you do that), one of the characters in the comic there that’s writing down transactions on a piece of paper is actually a node. That introduction is quite simplified, however — so let’s explain the concept of nodes in a bit more detail now.

Validation Nodes

A node is a computer running specific software. In the case of Bitcoin, a node is a Bitcoin program which connects to other Bitcoin nodes, i.e. other Bitcoin programs on the same machine or on other machines, which can be across the street or on the other side of the planet. There are several types and several versions of Bitcoin software. By picking a specific version of a specific Bitcoin program, a user "votes" for certain changes in the Bitcoin protocol. For example, if a bunch of users suggest increasing the total supply from 21 million BTC to 42 million, the majority of the network would be required to vote "yes" by installing the software implementing this change. Code changes are, thus, democratic.

Where this idea falls apart is in the fact that there are very few Bitcoin nodes out there — a mere 10000 currently.

10000 Bitcoin nodes

Continue reading "What is a Bitcoin Node? Mining versus Validation" on SitePoint.


Source: Sitepoint