Recommended Code Settings for WordPress

The following code snippets are purely for WordPress. Instead of keeping them in a notepad, we have published them here so that anyone can access them easily.


If this website has helped you, a cup of coffee would be a great treat. – Treats on Me




Updated on 25 April 2024

 

 
Fix Sitemap
XML declaration allowed only at the start of the document

 
For SEO
WordPress Ping List for Faster Indexing Of New Post
Setup Robots.txt
Redirect a single page
Redirect an entire site
Responsive SEO Image
Responsive SEO Anchor Text (Hyperlink)
Responsive SEO Anchor Text (Hyperlink) with New Tab
Responsive SEO Anchor Text (Hyperlink) with New Tab and No follow link
Reduce Google Bounce Rate
Display Google Reviews on Website
Schema – Rating Review on Votes

YouTube parameters

Dropping Symbol

Track Website Email Click / Website Phone Click

 
For Woocommerce
Block QTY Cart by Prevent Interaction

Most commonly used Unicode characters

• Circular Bullet Point (HTML entity &bull;, U+2022)
  Non-breaking space (HTML entity &nbsp;, U+00A0)


Schema – Rating Review on Votes

Displaying rating in Google Search
Insert before the closing </head> tag
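The original snippet is not reproduced here. As a minimal illustrative sketch (the product name, rating value and review count are placeholder assumptions), an aggregate rating can be marked up with schema.org JSON-LD:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Your Product or Service",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>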

Force SSL with www

Redirect all HTTP requests (non-www and www) and HTTPS requests without www, so the site always displays as HTTPS with www
Insert into .htaccess file

# Force HTTPS with www
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Force SSL with www – HSTS Supported

Redirect all HTTP requests (non-www and www) and HTTPS requests without www, so the site always displays as HTTPS with www
Insert into .htaccess file

# Force HTTPS with www - HSTS Supported
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}/$1 [R=301,L,E=HTTPS:1]

Force SSL without www – HSTS Supported

Redirect all http traffic to force display https without www
Insert into .htaccess file

# Force HTTPS without www
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Enable HTTP Strict Transport Security (HSTS)

HTTP Strict Transport Security (HSTS) is a security mechanism in which a website tells the browser that all future requests should be made over HTTPS. Using HSTS will force all future requests to the current domain name to use https:// URLs even if the user attempts to go to links using http:// URLs.
Reference: https://serverpilot.io/docs/how-to-enable-http-strict-transport-security-hsts/ , https://stackoverflow.com/questions/24144552/how-to-set-hsts-header-from-htaccess-only-on-https
Insert into .htaccess file

# Use HTTP Strict Transport Security to force client to use secure connections only
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS

Website Content Security Policy

Content Security Policy (CSP) is a new layer of security that helps to detect and mitigate certain forms of attacks, including Cross Site Scripting (XSS) and data injection attacks. These attacks are used for everything from data theft to site defacement to distribution of malware.
Reference: https://www.sitepoint.com/content-security-policy-getting-started/ , https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP
Insert into .htaccess file

# Content Security Policy - Apply a CSP to all HTML and PHP files

Header set Content-Security-Policy "policy-definition"
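Replace "policy-definition" with a real policy. A minimal sketch, assuming the policy should apply to HTML and PHP files only (the directives shown are placeholder assumptions; tailor them to the scripts, styles and images your site actually loads):

# Hypothetical CSP example - adjust the sources to match your site
<FilesMatch "\.(html|php)$">
    Header set Content-Security-Policy "default-src 'self'; img-src 'self' data: https:; script-src 'self' 'unsafe-inline'"
</FilesMatch>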

Increase Security with X-Security Headers

Adding the extra security headers "X-XSS-Protection", "X-Frame-Options" and "X-Content-Type-Options: nosniff" will help you pass the webpagetest.org security score.
Reference: https://htaccessbook.com/increase-security-x-security-headers/
Insert into .htaccess file


# Extra Security Headers - X-XSS-Protection, X-Frame-Options, X-Content-Type nosniff

	Header set X-XSS-Protection "1; mode=block"
	Header always append X-Frame-Options SAMEORIGIN
	Header set X-Content-Type-Options nosniff
	Header set Referrer-Policy "same-origin"
	Header set Feature-Policy "geolocation 'self'; vibrate 'none'"

Enable Keep-Alive

Opening a new connection for every request increases loading time and uses more resources on the server. Keep-Alive lets the browser reuse a single connection for multiple requests.
Reference: https://blog.stackpath.com/glossary/keep-alive/
Insert into .htaccess file

# Enable Keep-Alive

Header set Connection keep-alive

Disable Directory Browsing

Someone who knows the directory structure of a WordPress installation may use that knowledge to do harm. Besides, you should not let visitors see which plugins you are using.
Insert into .htaccess file

# disable directory browsing
Options All -Indexes

Force with www

Redirect all http traffic to force display with www
Insert into .htaccess file

# Force HTTP with www
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301,NC]

Or (untested):
# Force HTTP with www
RewriteCond %{HTTPS} on
RewriteRule (.*) http://www.%{HTTP_HOST}%{REQUEST_URI} [L]

Force without www

Redirect all http traffic to force display without www
Insert into .htaccess file

# Force HTTP without www
RewriteCond %{HTTPS} on
RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [L]

Enable Gzip Compression

Compression is a simple and effective way to save bandwidth and speed up your website.
Reference: https://torquemag.io/2016/04/enable-gzip-compression-wordpress/ , https://gtmetrix.com/enable-gzip-compression.html
Insert into .htaccess file


# Compress HTML, CSS, JavaScript, Text, XML and fonts
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
AddOutputFilterByType DEFLATE application/x-font
AddOutputFilterByType DEFLATE application/x-font-opentype
AddOutputFilterByType DEFLATE application/x-font-otf
AddOutputFilterByType DEFLATE application/x-font-truetype
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE font/opentype
AddOutputFilterByType DEFLATE font/otf
AddOutputFilterByType DEFLATE font/ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/xml

# Remove browser bugs (only needed for really old browsers)
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent

Enable Leverage browser caching of static assets

To enable browser caching you need to edit your HTTP headers to set expiry times for certain types of files.
Reference: https://gtmetrix.com/leverage-browser-caching.html
Insert into .htaccess file

# Leverage browser caching of static assets

  ExpiresActive On

  # Images
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/gif "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType image/svg+xml "access plus 1 year"
  ExpiresByType image/x-icon "access plus 1 year"

  # Video
  ExpiresByType video/mp4 "access plus 1 year"
  ExpiresByType video/mpeg "access plus 1 year"

  # CSS, JavaScript
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType text/javascript "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"

  # Others
  ExpiresByType application/pdf "access plus 1 month"
  ExpiresByType application/x-shockwave-flash "access plus 1 month"

Increase WordPress Memory Limit

If the WordPress memory limit shows 40MB and cannot be raised, put the following code above "require_once(ABSPATH . 'wp-settings.php');".
Insert into wp-config.php file

define('WP_MEMORY_LIMIT', '128M');

Responsive SEO Image
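The original image snippet is not reproduced here. A minimal sketch (the file path and the "Singapore Zoo" name are placeholders): a fluid-width image with descriptive, keyword-relevant alt text.

<img src="https://www.yourdomain.com/wp-content/uploads/singapore-zoo.jpg" alt="Singapore Zoo" style="max-width: 100%; height: auto;" />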



Responsive SEO Anchor Text (Hyperlink)
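The original snippet is not reproduced here. A minimal sketch (URL and anchor text are placeholders): the anchor text should describe the linked page.

<a href="https://www.yourdomain.com/singapore-zoo/" title="Singapore Zoo">Singapore Zoo</a>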


Responsive SEO Anchor Text (Hyperlink) with New Tab
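A minimal sketch of the same link opening in a new tab (URL and anchor text are placeholders; rel="noopener" is added here as a common safeguard):

<a href="https://www.yourdomain.com/singapore-zoo/" title="Singapore Zoo" target="_blank" rel="noopener">Singapore Zoo</a>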


Responsive SEO Anchor Text (Hyperlink) with New Tab and No follow link
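A minimal sketch of the same link opening in a new tab and marked nofollow so it passes no link equity (URL and anchor text are placeholders):

<a href="https://www.yourdomain.com/singapore-zoo/" title="Singapore Zoo" target="_blank" rel="nofollow noopener">Singapore Zoo</a>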


Reduce Google Bounce Rate

Apply before <body>
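The original snippet is not reproduced here. A commonly used sketch, assuming the analytics.js ga() tracker is already loaded on the page (the event category, action and 15-second delay are placeholder assumptions): fire an interaction event after a delay, so visitors who stay on the page are no longer counted as bounces.

<script>
setTimeout(function() {
    // Hypothetical event names; any interaction event stops the visit counting as a bounce
    ga('send', 'event', 'NoBounce', '15 seconds or more on page');
}, 15000);
</script>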




Redirect all visitors except your IP

How to redirect all visitors except your IP to another website? – https://www.siteground.com/kb/how_to_redirect_all_visitors_except_your_ip_to_another_site/
Insert into .htaccess file


ErrorDocument 403 http://www.yourdomain.com/under-maintenance/
Order deny,allow
Deny from all
Allow from xx.xx.xx.xx
Allow from xx.xx.xx.xx

Setup Robots.txt

Control how crawlers access your webpages
Insert into robots.txt file
Reference: https://www.yourdreamtoys.com/robots.txt

########################################################
#
#    yourdomain.com Robots File
#    Last Updated on May 21st, 2020
#
########################################################

User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /readme.html
Disallow: /license.txt
Disallow: /xmlrpc.php
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /*?*
Disallow: /*?
Disallow: /*~*
Disallow: /*~
Disallow: */disclaimer/*
Disallow: *?attachment_id=
Disallow: /privacy-policy
Allow: /wp-content/uploads/
Allow: /*/*.css
Allow: /*/*.js
Disallow: /wp-json/
Disallow: /?rest_route=/

Sitemap: http://www.yourdomain.com/sitemap_index.xml

User-agent: Googlebot
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Mediapartners-Google
Allow: /

User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Msnbot
Allow: /

User-agent: msnbot-media
Allow: /wp-content/uploads/

User-agent: Applebot
Allow: /

User-agent: Yandex
Allow: /

User-agent: YandexImages
Allow: /wp-content/uploads/

User-agent: Slurp
Allow: /

User-agent: DuckDuckBot
Allow: /

User-agent: Qwantify
Allow: /

# Popular chinese search engines

User-agent: Baiduspider
Allow: /
User-agent: Baiduspider/2.0
Allow: /
User-agent: Baiduspider-video
Allow: /
User-agent: Baiduspider-image
Allow: /
User-agent: Sogou spider
Allow: /
User-agent: Sogou web spider
Allow: /
User-agent: Sosospider
Allow: /
User-agent: Sosospider+
Allow: /
User-agent: Sosospider/2.0
Allow: /
User-agent: yodao
Allow: /
User-agent: youdao
Allow: /
User-agent: YoudaoBot
Allow: /
User-agent: YoudaoBot/1.0
Allow: /

# Spam Backlink Blocker

Disallow: /feed/
Disallow: /feed/$
Disallow: /comments/feed
Disallow: /trackback/
Disallow: */?author=*
Disallow: */author/*
Disallow: /author*
Disallow: /author/
Disallow: */comments$
Disallow: */feed
Disallow: */feed$
Disallow: */trackback
Disallow: */trackback$
Disallow: /?feed=
Disallow: /wp-comments
Disallow: /wp-feed
Disallow: /wp-trackback
Disallow: */replytocom=

# Block Bad Bots.

User-agent: DotBot
Disallow: /

User-agent: GiftGhostBot
Disallow: /

User-agent: Seznam
Disallow: /

User-agent: PaperLiBot
Disallow: /

User-agent: Genieo 
Disallow: /

User-agent: Dataprovider/6.101
Disallow: /

User-agent: DataproviderSiteExplorer
Disallow: /

User-agent: Dazoobot/1.0
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: DomainStatsBot/1.0
Disallow: /

User-agent: DotBot/1.1
Disallow: /

User-agent: dubaiindex
Disallow: /

User-agent: eCommerceBot
Disallow: /

User-agent: ExpertSearchSpider
Disallow: /

User-agent: Feedbin
Disallow: /

User-agent: Fetch/2.0a
Disallow: /

User-agent: FFbot/1.0
Disallow: /

User-agent: focusbot/1.1
Disallow: /

User-agent: HuaweiSymantecSpider
Disallow: /

User-agent: HuaweiSymantecSpider/1.0
Disallow: /

User-agent: JobdiggerSpider
Disallow: /

User-agent: LemurWebCrawler
Disallow: /

User-agent: LipperheyLinkExplorer
Disallow: /

User-agent: LSSRocketCrawler/1.0
Disallow: /

User-agent: LYT.SRv1.5
Disallow: /

User-agent: MiaDev/0.0.1
Disallow: /

User-agent: Najdi.si/3.1
Disallow: /

User-agent: BountiiBot
Disallow: /

User-agent: Experibot_v1
Disallow: /

User-agent: bixocrawler
Disallow: /

User-agent: bixocrawler TestCrawler
Disallow: /

User-agent: Crawler4j
Disallow: /

User-agent: Crowsnest/0.5
Disallow: /

User-agent: CukBot
Disallow: /

User-agent: Dataprovider/6.92
Disallow: /

User-agent: DBLBot/1.0
Disallow: /

User-agent: Diffbot/0.1
Disallow: /

User-agent: Digg Deeper/v1
Disallow: /

User-agent: discobot/1.0
Disallow: /

User-agent: discobot/1.1
Disallow: /

User-agent: discobot/2.0
Disallow: /

User-agent: discoverybot/2.0
Disallow: /

User-agent: Dlvr.it/1.0
Disallow: /

User-agent: DomainStatsBot/1.0
Disallow: /

User-agent: drupact/0.7
Disallow: /

User-agent: Ezooms/1.0  
Disallow: /

User-agent: fastbot crawler beta 2.0  
Disallow: /

User-agent: fastbot crawler beta 4.0  
Disallow: /

User-agent: feedly social
Disallow: /

User-agent: Feedly/1.0  
Disallow: /

User-agent: FeedlyBot/1.0  
Disallow: /

User-agent: Feedspot  
Disallow: /

User-agent: Feedspotbot/1.0
Disallow: /

User-agent: Clickagy Intelligence Bot v2
Disallow: /

User-agent: classbot
Disallow: /

User-agent: CISPA Vulnerability Notification
Disallow: /

User-agent: CirrusExplorer/1.1
Disallow: /

User-agent: Checksem/Nutch-1.10
Disallow: /

User-agent: CatchBot/5.0
Disallow: /

User-agent: CatchBot/3.0
Disallow: /

User-agent: CatchBot/2.0
Disallow: /

User-agent: CatchBot/1.0
Disallow: /

User-agent: CamontSpider/1.0
Disallow: /

User-agent: Buzzbot/1.0
Disallow: /

User-agent: Buzzbot
Disallow: /

User-agent: BusinessSeek.biz_Spider
Disallow: /

User-agent: BUbiNG
Disallow: /

User-agent: 008/0.85
Disallow: /

User-agent: 008/0.83
Disallow: /

User-agent: 008/0.71
Disallow: /

User-agent: ^Nail
Disallow: /

User-agent: FyberSpider/1.3
Disallow: /

User-agent: findlinks/1.1.6-beta5
Disallow: /

User-agent: g2reader-bot/1.0
Disallow: /

User-agent: findlinks/1.1.6-beta6
Disallow: /

User-agent: findlinks/2.0
Disallow: /

User-agent: findlinks/2.0.1
Disallow: /

User-agent: findlinks/2.0.2
Disallow: /

User-agent: findlinks/2.0.4
Disallow: /

User-agent: findlinks/2.0.5
Disallow: /

User-agent: findlinks/2.0.9
Disallow: /

User-agent: findlinks/2.1
Disallow: /

User-agent: findlinks/2.1.5
Disallow: /

User-agent: findlinks/2.1.3
Disallow: /

User-agent: findlinks/2.2
Disallow: /

User-agent: findlinks/2.5
Disallow: /

User-agent: findlinks/2.6
Disallow: /

User-agent: FFbot/1.0
Disallow: /

User-agent: findlinks/1.0
Disallow: /

User-agent: findlinks/1.1.3-beta8
Disallow: /

User-agent: findlinks/1.1.3-beta9
Disallow: /

User-agent: findlinks/1.1.4-beta7
Disallow: /

User-agent: findlinks/1.1.6-beta1
Disallow: /

User-agent: findlinks/1.1.6-beta1 Yacy
Disallow: /

User-agent: findlinks/1.1.6-beta2
Disallow: /

User-agent: findlinks/1.1.6-beta3
Disallow: /

User-agent: findlinks/1.1.6-beta4
Disallow: /

User-agent: bixo
Disallow: /

User-agent: bixolabs/1.0
Disallow: /

User-agent: Crawlera/1.10.2
Disallow: /

User-agent: Dataprovider Site Explorer
Disallow: /

# Backlink Protector. 

User-agent: AhrefsBot
Disallow: /

User-agent: Alexibot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: SurveyBot
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: rogerbot
Disallow: /

# Loading Performance for Woocommerce

Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?orderby=price
Disallow: /*?orderby=rating
Disallow: /*?orderby=date
Disallow: /*?orderby=price-desc
Disallow: /*?orderby=popularity
Disallow: /*?filter
Disallow: /*add-to-cart=*

# Avoid crawler traps causing crawl budget issues

Disallow: /search/
Disallow: *?s=*
Disallow: *?p=*
Disallow: *&p=*
Disallow: *&preview=*
Disallow: /search

# Social Media Crawling

User-agent: facebookexternalhit/1.0
Allow: /
User-agent: facebookexternalhit/1.1
Allow: /
User-agent: facebookplatform/1.0
Allow: /
User-agent: Facebot/1.0
Allow: /
User-agent: Visionutils/0.2
Allow: /
User-agent: datagnionbot
Allow: /

# Social Media Crawling

User-agent: Twitterbot
Allow: /

# Social Media Crawling

User-agent: LinkedInBot/1.0
Allow: /

# Social Media Crawling

User-agent: Pinterest/0.1
Allow: /
User-agent: Pinterest/0.2
Allow: /

# Allow/Disallow Ads.txt

Allow: /ads.txt

# Allow/Disallow App-ads.txt

Allow: /app-ads.txt

Crawl-delay: 5

Display Google Reviews on Website

How to Display Google Reviews on your Website – https://www.launch2success.com/guide/display-google-reviews-website-2017
Step 1. Copy This Code.


<div id="google-reviews"></div>

<link rel="stylesheet" href="https://cdn.rawgit.com/stevenmonson/googleReviews/master/google-places.css">
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/stevenmonson/googleReviews/6e8f0d79/google-places.js"></script>
<script src="https://maps.googleapis.com/maps/api/js?v=3.exp&key=<!--- Google Console Map API --->&signed_in=true&libraries=places"></script>

<script>
jQuery(document).ready(function( $ ) {
   $("#google-reviews").googlePlaces({
        placeId: '' //Find placeID @: https://developers.google.com/places/place-id
      , render: ['reviews']
      , min_rating: 4
      , max_rows:4
   });
});
</script>

Replace "<!--- Google Console Map API --->" with your Google Maps API key, and set placeId to your Google Place ID.

Step 2. Find Your Google Place ID
You can find your Place ID at https://developers.google.com/places/place-id

Can’t find your Google Places ID? Read This Article.

WordPress Ping List for Faster Indexing Of New Post

A WordPress ping list is very useful for extending your blog's reach to major search engines.
Insert in WordPress Setting > Writing Settings
Under Update Services
When you publish a new post, WordPress automatically notifies the following site update services.
Reference: https://www.shoutmeloud.com/wordpress-ping-list.html

http://blogsearch.google.com/ping/RPC2
http://bblog.com/ping.php
http://bitacoras.net/ping
http://blog.goo.ne.jp/XMLRPC
http://blogdb.jp/xmlrpc
http://blogmatcher.com/u.php
http://bulkfeeds.net/rpc
http://coreblog.org/ping/
http://mod-pubsub.org/kn_apps/blogchatt
http://www.lasermemory.com/lsrpc/
http://ping.blo.gs/
http://ping.bloggers.jp/rpc/
http://ping.feedburner.com
http://ping.rootblog.com/rpc.php
http://pingoat.com/goat/RPC2
http://rpc.blogbuzzmachine.com/RPC2
http://rpc.blogrolling.com/pinger/
http://rpc.pingomatic.com
http://rpc.weblogs.com/RPC2
http://topicexchange.com/RPC2
http://trackback.bakeinu.jp/bakeping.php
http://www.bitacoles.net/ping.php
http://www.blogoole.com/ping/
http://www.blogpeople.net/servlet/weblogUpdates
http://www.blogshares.com/rpc.php
http://www.blogsnow.com/ping
http://www.blogstreet.com/xrbin/xmlrpc.cgi
http://www.mod-pubsub.org/kn_apps/blogchatter/ping.php
http://www.newsisfree.com/RPCCloud
http://www.newsisfree.com/xmlrpctest.php
http://www.snipsnap.org/RPC2
http://www.weblogues.com/RPC/
http://xmlrpc.blogg.de

Redirect a single page

Redirect a URL the quick, easy, and search-engine-friendly way
Insert in .htaccess file
Reference: https://css-tricks.com/snippets/htaccess/301-redirects/

Redirect 301 /oldpage.html http://www.yoursite.com/newpage.html
Redirect 301 /oldpage2.html http://www.yoursite.com/folder/

Redirect an entire site

This redirects with links intact: www.oldsite.com/existing/link.html becomes www.newsite.com/existing/link.html. This is very helpful when you are just “moving” a site to a new domain.
Insert in .htaccess file
Reference: https://css-tricks.com/snippets/htaccess/301-redirects/

Redirect 301 / http://newsite.com/

Track Website Email Click / Website Phone Click

Access Google Analytics
Google Analytics Goals:
Step 1. Goal setup
• Custom
Step 2. Goal description
• Name: Email Click Event / Phone Click Event
Goal type: Event
Step 3. Goal details
• Category = button
• Action = click
• Label = email / phone
• Value = “blank”
Step 4. Click “Save”

Insert before the closing </head> tag, just below the Google Analytics code
Note: Remember to change “info@yourdomain.com” in the code to your email address
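The original snippet is not reproduced here. A minimal sketch, assuming jQuery and the analytics.js ga() tracker are already loaded (the selectors are placeholder assumptions; the category, action and label match the goal details above):

<script>
jQuery(document).ready(function($) {
    // Hypothetical selectors: change "info@yourdomain.com" to your own email address
    $('a[href="mailto:info@yourdomain.com"]').on('click', function() {
        ga('send', 'event', 'button', 'click', 'email');
    });
    // Track clicks on any telephone link
    $('a[href^="tel:"]').on('click', function() {
        ga('send', 'event', 'button', 'click', 'phone');
    });
});
</script>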

Dropping Symbol

Apply before

Expert WordPress htaccess Setting

This .htaccess setting combines: Force HTTPS with www, Enable HTTP Strict Transport Security (HSTS), Enable Gzip Compression, Website Content Security Policy, Increase Security with X-Security Headers, and Enable Leverage browser caching of static assets.
Insert into .htaccess file

# Use HTTP Strict Transport Security to force client to use secure connections only
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
# Force HTTPS with www
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}/$1 [R=301,L,E=HTTPS:1]

# Compress HTML, CSS, JavaScript, Text, XML and fonts
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
AddOutputFilterByType DEFLATE application/x-font
AddOutputFilterByType DEFLATE application/x-font-opentype
AddOutputFilterByType DEFLATE application/x-font-otf
AddOutputFilterByType DEFLATE application/x-font-truetype
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE font/opentype
AddOutputFilterByType DEFLATE font/otf
AddOutputFilterByType DEFLATE font/ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/xml

# Remove browser bugs (only needed for really old browsers)
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent

# Content Security Policy - Apply a CSP to all HTML and PHP files

Header set Content-Security-Policy "policy-definition"

# Extra Security Headers - X-XSS-Protection, X-Frame-Options, X-Content-Type nosniff

Header set X-XSS-Protection "1; mode=block"
Header always append X-Frame-Options SAMEORIGIN
Header set X-Content-Type-Options nosniff
Header set Referrer-Policy "no-referrer-when-downgrade"
Header set Feature-Policy "geolocation 'self'; vibrate 'none'"

# Leverage browser caching of static assets

ExpiresActive On

# Images
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType image/webp "access plus 1 year"
ExpiresByType image/svg+xml "access plus 1 year"
ExpiresByType image/x-icon "access plus 1 year"

# Video
ExpiresByType video/mp4 "access plus 1 year"
ExpiresByType video/mpeg "access plus 1 year"

# CSS, JavaScript
ExpiresByType text/css "access plus 1 month"
ExpiresByType text/javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"

# Others
ExpiresByType application/pdf "access plus 1 month"
ExpiresByType application/x-shockwave-flash "access plus 1 month"


# BEGIN WordPress

RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]

# END WordPress

Import the Google Analytics goals (Email Click / Phone Click) into Google AdWords
Adwords > Tools > Conversion actions > Google Analytics
Select to import:
Step 1. Category: Lead
Step 2. Value: Don’t assign a value
Step 3. Click “Import goals”

Block QTY Cart by Prevent Interaction

Add this CSS to stop users from typing directly into the quantity field, so the quantity can only be changed with the “plus” and “minus” buttons.

.ux-quantity .input-text.qty.text {
    pointer-events: none;  /* Disable mouse interactions */
}

XML declaration allowed only at the start of the document

Leading whitespace before the XML declaration causes sitemap_index.xml to fail to display the sitemap.
Reference: https://stackoverflow.com/questions/14685893/xml-declaration-allowed-only-at-the-start-of-the-document
Create a PHP file “whitespacefix.php” in the root directory

<?php
function ___wejns_wp_whitespace_fix($input) {
    $allowed = false;
    $found = false;
    foreach (headers_list() as $header) {
        if (preg_match("/^content-type:\\s+(text\\/|application\\/((xhtml|atom|rss)\\+xml|xml))/i", $header)) {
            $allowed = true;
        }
        if (preg_match("/^content-type:\\s+/i", $header)) {
            $found = true;
        }
    }
    if ($allowed || !$found) {
        return preg_replace("/\\A\\s*/m", "", $input);
    } else {
        return $input;
    }
}
ob_start("___wejns_wp_whitespace_fix");
?>

Then open the “index.php” file and add the following line right after the <?php tag

include('whitespacefix.php');

YouTube parameters

Append these parameters to a YouTube embed URL to hide related videos (rel=0), hide the player controls (controls=0), hide the video title and uploader info (showinfo=0), and reduce the YouTube branding (modestbranding=1).
Reference: https://freshysites.com/web-design-development/how-to-use-youtube-parameters-and-recent-changes/

&rel=0&controls=0&showinfo=0&modestbranding=1
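A minimal usage sketch, appending the parameters to the embed URL (VIDEO_ID and the frame size are placeholders):

<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID?rel=0&controls=0&showinfo=0&modestbranding=1" frameborder="0" allowfullscreen></iframe>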