Google Compute Engine does not allow outbound connections on ports 25, 465, and 587. These SMTP ports are blocked by default because of abuse.
To send email from a Compute Engine instance, Google recommends one of the following:
Relay emails through your G Suite (formerly known as Google Apps) account
Relay through a third-party email service such as SendGrid, Mailgun, or Mailjet
Connect your instance to your network via VPN and use your network to send email
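You can verify the blockage yourself by probing the ports from your instance with `nc`. This is just a quick sketch; `smtp.gmail.com` is an arbitrary example target, and from a Compute Engine instance each attempt is expected to time out:

```shell
# Probe the SMTP ports that GCE blocks for outbound traffic.
# -z: scan without sending data, -w 3: give up after 3 seconds.
for port in 25 465 587; do
  if nc -z -w 3 smtp.gmail.com "$port" 2>/dev/null; then
    echo "port $port: open"
  else
    echo "port $port: blocked or timed out"
  fi
done
```

From a network that allows outbound SMTP, the same loop will typically show the ports as open, confirming the block is on the GCE side.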
Setting up a relay service would require installing and configuring Postfix, which we won’t be doing in this tutorial. Instead, we will use a WordPress WP Mail plugin that lets us send emails through Gmail’s SMTP server. I recommend setting up a new Gmail account just for this purpose.
I’ve started the new certification path for Platform Developer II. I passed the multiple-choice exam a long time ago, but it was ridiculously hard to get a slot for the programming assignment. With this new format, I need to complete these four superbadges.
So, 1 out of 4 completed. I nailed the Apex Specialist superbadge.
The Apex Specialist superbadge has a total of 6 mini challenges to complete. The requirements are straightforward, and if you have been developing on the platform for some time, the challenges should be a breeze. It still took me 6 hours to complete them, though.
Takeaways from the challenge
There are many options for building the solution. Just stick with what you are most familiar with.
Use maps for more efficient querying and to bulkify your triggers.
I thought I already knew all my JSON stuff until I tried to manually parse a JSON string. Get familiar with using the JSON class methods. Check my post on Demystifying JSON parsing in Apex.
Since the requirements are not that complicated, this would have been a good opportunity to try test-driven development.
If you need help completing the challenge hit the comments below and I’ll be glad to help. Good luck on getting that superbadge.
You got back a JSON string, either as a response from a REST service, passed from your Visualforce page via a remote action, or passed from your Lightning component to your Apex controller. Now, how do you parse this information? For simple JSON structures, Salesforce can automatically parse them for you. There are also online tools that generate Apex classes from a JSON string to assist with parsing the JSON input.
Still, it is good to get back to basics and understand a little about how a JSON string is structured and how the parsing is done.
While coming up with a solution design for an API integration piece at work, I found that the REST API batch resource fits my use case. The feature has been around since Summer 2015, though I didn’t know much about it then. Basically, the batch resource allows you to make multiple requests in a single API call.
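As a sketch of what a batch call looks like over HTTP: you POST up to 25 subrequests to the composite/batch resource, and they execute serially and independently in one round trip. The instance URL, token, API version, and record ID below are all placeholders, not values from an actual org:

```shell
INSTANCE="https://yourInstance.salesforce.com"   # placeholder instance URL
TOKEN="00D...sessionOrOauthToken"                # placeholder access token

# Each subrequest URL is relative to /services/data/; a failing
# subrequest does not roll back the others.
BODY='{
  "batchRequests": [
    { "method": "GET",
      "url": "v39.0/sobjects/Account/describe" },
    { "method": "PATCH",
      "url": "v39.0/sobjects/Account/001D000000K0fXOIAZ",
      "richInput": { "Name": "NewName" } }
  ]
}'

curl -s -X POST "$INSTANCE/services/data/v39.0/composite/batch" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "$BODY" \
  || echo "request failed (placeholder credentials)"
```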
I needed to come up with a simple solution for creating a single record, with the option to pass an array of records to be created. I initially looked at 3 options and listed the things to consider on the development side.
REST API via the sObject resource
use the sObject resource for record creation
no Apex code required
cannot accept an array of records
Apex REST API
accepts an array of records
requires a JSON parser and deserializing the request body
requires test classes
overkill for the job, with too many things to consider to get the integration set up
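For comparison, creating a single record with the sObject resource is one POST with no Apex involved. The instance URL and token below are placeholders:

```shell
INSTANCE="https://yourInstance.salesforce.com"   # placeholder instance URL
TOKEN="00D...sessionOrOauthToken"                # placeholder access token

# One record per call -- the sObject resource has no array form,
# which is why it did not fit the multi-record requirement above.
curl -s -X POST "$INSTANCE/services/data/v39.0/sobjects/Account" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ "Name": "Express Logistics and Transport" }' \
  || echo "request failed (placeholder credentials)"
```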
I’ve been doing a small bit of Google Cloud Platform (GCP) configuration to run this blog. I thought I’d start sharing some of the steps I’ve gone through, which could be useful to others out there. I’ve set up a GCP Compute Engine instance and configured it with a Bitnami WordPress stack. I’ve got some videos of those steps in this playlist if you want to follow along.
In this post, I will be discussing how to access phpMyAdmin on GCP via an SSH tunnel on your Mac.
To access phpMyAdmin on a Bitnami WordPress stack on Google Compute Engine, you need to use an SSH tunnel. On the Bitnami stack, phpMyAdmin is blocked from public access and is only reachable from localhost. This is where the SSH tunnel comes in: you connect to a particular port on your local machine, which forwards the traffic to the remote server over an encrypted channel, and the remote server sends the content back to your local machine.
Here is an illustration on how that access is provided via the SSH tunnel.
Another example of using an SSH tunnel: when a website is blocked by your company firewall or proxy filter, you can tunnel through a remote computer that has no such restriction and can access the blocked website.
With that said, let’s connect via an SSH tunnel on your Mac.
Open the terminal and change directory to where your private key is located.
Type in the following, replacing the private key name, user, and IP to match yours. Enter your passphrase when prompted.
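A sketch of the command, assuming a Bitnami image (which uses the `bitnami` user); the key file name and IP are placeholders, and 8888 is just a free local port:

```shell
# Forward local port 8888 to port 80 on the instance, where phpMyAdmin
# listens on localhost only. -N keeps the session forwarding-only
# (no remote shell); -i points at the private key in this directory.
ssh -N -L 8888:127.0.0.1:80 \
    -o ConnectTimeout=10 \
    -i ./my-gcp-key bitnami@203.0.113.10 \
  || echo "tunnel exited: check the key path, user, and IP"
```

Leave that running, then browse to http://127.0.0.1:8888/phpmyadmin on your Mac. Ctrl-C in the terminal closes the tunnel.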