Categories
Hacking

WordPress ico file hack, index.php hack

This hack is characterized by a few things:
1. Admin accounts you didn’t make showing up.
2. Roles you didn’t make showing up.
3. Malicious “ico” files showing up on the server with php code in them.
4. Random php files showing up across the site directories.
5. A lot of extra “index.php” files showing up across the site.
6. php and inc files that have been modified with obfuscated code.
7. Cross-site infections on a shared WHM. Infected files in the home/cpeasyapache directory.

Some are saying there are “back doors” the hackers are using to get in. I doubted this after patching to 7.59, as the malicious admin accounts stopped showing back up. What was actually happening is that the “ico” files were popping up every 5–10 hours, and these in turn were creating a bunch of random-string php files and index.php files that pointed back to the ico file. This code seemed to be able to infect multiple sites sharing the same WHM.

Steps to fix the hack without trashing your sites:

Prep:
1. Backup your site and database, and put the site into maintenance mode.
2. Make sure you have SSH access to your WHM or cPanel.
3. Log in to the site and remove any malicious admin accounts.
4. Remove any roles you didn’t make.
5. From cPanel, using phpMyAdmin, look at the users and roles tables in your database and delete any entries you know shouldn’t be there. (User 0 is supposed to be there.)
6. Change all database and user passwords.

Fix:
1. From SSH, run a find from the public_html folder like this:

find . -name index.php

If you see a shitload of files come up on Drupal, you are hacked. There is only supposed to be one index.php.
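A quick way to gauge the damage is to count the hits (the `.` is just the directory you are standing in, e.g. public_html):

```shell
# Count index.php files under the current directory.
# On a clean Drupal root this should print exactly 1.
find . -name "index.php" | wc -l
```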

2. Delete ALL of the index.php files with this command:

find . -name index.php -exec rm -rf {} \;

We will replace the main index.php file when we update to the newest Drupal.

3. Search for icon files that don’t belong with this command:

find . -name "*.ico"

If you see a bunch of strange icon files starting with a . and having a random string, they are php files. I just removed all icon files as I don’t give a shit about them with:

find . -name "*.ico" -exec rm -rf {} \;

If you want to keep your icon files, just manually delete the suspicious ones with ftp.

I’m reading that there were also files other than .ico associated with this hack. An easy way to identify which extension they are using is to open up one of your hacked index.php files and copy this line of text:

@include "\057h\157m\145/\147l\145n\143c\057p\165b\154i\143_\150t\155l\057p\162o\146i\154e\163/\164e\163t\151n\147/\0564\0621\142e\071a\067.\151c\157";

Paste it here: Unphp.net

It will tell you which malicious file it is pointed at. Use that information to remove all of the files of that type using the SSH command above but for the malicious extension instead of .ico.
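If you’d rather not paste anything into a third-party site, you can decode the escaped path locally. The string is just a file path with php-style octal escapes, and bash’s `$'…'` quoting understands the same `\nnn` escapes. A sketch, using the sample @include line above:

```shell
# Decode the octal-escaped path from the malicious @include line (bash):
echo $'\057h\157m\145/\147l\145n\143c\057p\165b\154i\143_\150t\155l\057p\162o\146i\154e\163/\164e\163t\151n\147/\0564\0621\142e\071a\067.\151c\157'
# Prints: /home/glencc/public_html/profiles/testing/.421be9a7.ico
```

Here the payload is a hidden .ico file, so the find commands above will catch it; if the decoded path ends in some other extension, search for that extension instead.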

4. Find the other malicious php files that don’t belong. On my site, they were all php files with eight-character a-z names, like dkelfesa.php. Use this regex to look for eight-character php files:

find . -type f | egrep './[a-z]{8}\.php'

Remove any you suspect are malicious. Typically the name won’t make any sense. If you open them and see a bunch of php code you can’t read, they are malicious.

5. After doing all of that, I was still getting hacked and couldn’t figure it out. I decided to open my database for the site in phpMyAdmin and run a search on all tables for:

<?php 

to look for any malicious php code I had missed. It turns out that when the original hack happened, the hacker had used the admin account to create a block called “Development”. He gave it no title, and in the body he inserted his malicious php code. He then enabled the block on the site. He could call it remotely, and it would reinfect the site with the ico files, the index.php files, and the random-string php files. Run a database search for the php tag above and check all results for malicious code across your site in content nodes or blocks. Remove any code you can’t easily read.
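If phpMyAdmin’s search is slow on a big database, you can run the same check against a SQL dump from the shell. A sketch: the dump.sql below is a tiny stand-in (a real dump would come from phpMyAdmin’s export tab, or `mysqldump -u dbuser -p your_db > dump.sql` with your own credentials), and the table names are only illustrative:

```shell
# Stand-in dump containing an infected custom block, like the "Development" one:
cat > dump.sql <<'EOF'
INSERT INTO `block_custom` VALUES (1,'<?php eval(base64_decode($payload)); ?>','','full_html');
INSERT INTO `node_revisions` VALUES (7,'A perfectly normal article body.');
EOF

# List every dump line containing embedded php, with line numbers:
grep -n '<?php' dump.sql
```

Any hit points you at the table and row holding injected code, which you can then clean out in phpMyAdmin.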

6. There were also quite a few php and inc files across the Drupal installation that had either been modified with malicious php at the top of the file, or just didn’t belong. They were cleverly named and stashed away in modules folders, library folders, everywhere. As I discovered them, I created these grep patterns to help others find them. Unfortunately, I found 4 malicious files in the /home/cpeasyapache directory. These looked to have the ability to scan all sites on the shared host and infect them all. Use these searches at the WHM root level to look for malicious php and inc files. Remove or clean them.

find . -type f -name '*.php' | xargs grep -l " *=PHP_VERSION *" 
find . -type f -name '*.php' | xargs grep -l " *Phar::interceptFileFuncs() *"
find . -type f -name '*.php' | xargs grep -l " *@include *" 
find . -type f -name '*.php' | xargs grep -l " *interceptFileFuncs *"
find . -type f -name '*.php' | xargs grep -l " *eval *( *gzinflate *( *base64_decode *( *"
find . -type f -name '*.php' | xargs grep -l " *base64_decode *"
find . -type f -name '*.php' | xargs grep -l " *function *wscandir *"
find . -type f -name '*.php' | xargs grep -l " *HTTP/1.0 *404 *Not *Found *"
find . -type f -name '*.php' | xargs grep -l " *@gzuncompress *" 
find . -type f -name '*.php' | xargs grep -l " *Array *( *) *; *global *" 
find . -type f -name '*.php' | xargs grep -l " *@unserialize *" 
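The individual searches above can also be collapsed into a single pass over both php and inc files. A sketch: the alternation covers the distinctive markers from the list above (skipping the generic 404 and Array patterns, which false-positive heavily), and it is a starting point, not an exhaustive malware signature:

```shell
# One pass over php and inc files, listing any file that contains
# a marker commonly seen in this infection's obfuscated payloads:
find . -type f \( -name '*.php' -o -name '*.inc' \) -print0 \
  | xargs -0 -r grep -lE 'PHP_VERSION|interceptFileFuncs|gzinflate|base64_decode|wscandir|@gzuncompress|@unserialize'
```

Review the hits by hand before deleting anything; base64_decode and friends do show up in legitimate code too.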

If you don’t have root access, contact your hosting provider and have them scan for the above patterns and remove the malicious files.

Categories
Javascript Others

WA Group Contacts Exporter

https://chrome.google.com/webstore/detail/wa-group-contacts-exporte/logphdfbdkcnanagckdlfeiolbjboneh

If you want, you can contact me directly.

Thanks

 

Categories
Others

oxigen wallet clone php script

oxigen wallet clone php script

Link: http://midlep.com

Features:
You can sign up and log in.

Password reset via OTP

Recharge via wallet

Money Transfer

Wallet Transfer

Bill Payment

Payback point credit system

and many more.

Payumoney Payment Gateway and rechpai recharge api

Laravel Based Admin Panel SuperCharged.

Price Rs.25000/-

Call me: +91-9038655955

Categories
Tech News

Web trends that lead in 2019

Web trends that lead in 2019

Artificial Intelligence

Artificial intelligence is one of the core components of any excellent digital transformation strategy. And as more companies vie to implement digital transformation strategies, the role of artificial intelligence in web development will become even more central than ever before.


Single Page Applications (SPAs)

SPA architecture

SPAs have caught on this year. And because they’re relatively easy to navigate, they look poised to grow in popularity. With simplicity and speed at the heart of their design, they’re increasingly worming their way into the hearts of web users. As the name suggests, a single-page application is one long webpage free of intricate navigation and complex menus. SPAs run perfectly on both desktop and mobile devices.

Blockchain Technology

Bitcoin and a host of other cryptocurrencies are what dragged blockchain technology into the mainstream. When BTC hit the market, many traders switched from USD to ETH, as this currency had proven to be quite lucrative. Based on a peer-to-peer network of computers that stores data in a distributed ledger, ensuring the data never gets compromised, blockchain technology is increasingly gaining wider use in various industries.

Blockchain technology stack

Blockchain technology can facilitate the instantaneous authentication of massive transactions and the streamlined management of supply chains, to mention but a few of its emerging applications. Today, industry heavyweights including IBM, Microsoft and Amazon are all developing their own blockchain technology platforms.

Motion UI

Web users are increasingly inclined toward simple, yet captivating and interactive graphics on web pages. Web development companies are coming to realize that the diminishing attention span of the average internet user calls for more captivating graphics on web pages. Static images and flash-based graphics are relegating many websites to web users’ back burners, while evocative motion UI is becoming all the rage these days.

The growing preference among web users for intuitive motion UI will drive the popularity of motion UI through the roof in the coming years.

More Enriched Designs

For a long time, the focus on functionality and simplicity has been a dominant trend in web design. However, there’s been an increasing shift towards the use of more images, logo animation and shapes in web design in a bid to keep up with the growing demands for more vibrant designs by web users.

Enriched design example

Graphic contents are increasingly being used in place of text contents. The growing demand for more engaging graphics by web users will continue to increase in 2019, and this will fuel the need for the use of more captivating graphics in web developments.

Push Notifications

Owing to the tremendous success they’ve brought innumerable mobile apps, push notifications have become critical components of most websites. Push notifications are fast replacing email newsletters because they’re easier to manage for both users and publishers.

Push notifications examples

They’re now being used in most websites for notifying users about newly published content, special offers, and personalized promotions. Their usability and accessibility will continue to drive their trendiness in 2019 and beyond.

Progressive Web Apps (PWAs)

Progressive web apps bring together the best of browsers, websites and mobile applications. A progressive web app is a website that functions almost like a native mobile app. Most brands that have made the switch from websites to progressive web apps have seen significant increases in their conversion rates and user engagement. Since they combine the best of websites and mobile apps, they’re poised to remain a leading trend in 2019.

PWA architecture

Low Code Development

Low code development solution

By providing a faster and easier way to develop and deploy websites, low-code development enables the rapid creation of excellent websites without the need for deep coding expertise. Low-code development has been making serious waves, as it is increasingly preferred by web development companies over conventional development processes.

Adaptability

Responsive vs adaptive design

In order to reach a wider audience, most brands are opting for websites that display perfectly on a wide range of smart devices. The trend toward adaptability owes much to the fact that more web users are accessing the web through multiple mobile devices. As companies look to capture a greater share of their niche audiences by making their websites responsive to a wider range of devices, adaptability is bound to remain a key web development trend in the coming year.

Cybersecurity

In a recent survey conducted by Alert Logic, IT professionals named data breaches, data privacy infringement and confidentiality breaches as their greatest concerns. Many similar statistics show that most organizations are bracing for an unprecedented wave of cyber attacks that will leave their working environments more hazardous. As a result, cybersecurity is poised to remain a dominant trend in web development for the foreseeable future.

Cybersecurity trends

 

What do you think?

Categories
Tech News

globalmarketingpvt.ltd global marketing pvt

I created a website, live at
http://globalmarketingpvt.ltd

If you like it, I will share the frontend and backend details. Contact me via: info@chandandubey.com

I have 100% of the source code ready.

Categories
SEO

My points regarding seo

We recently analyzed 1 million Google search results to answer the question:

Which factors correlate with first page search engine rankings?

We looked at content. We looked at backlinks. We even looked at site speed.

With the help of Eric Van Buskirk and our data partners, we uncovered some interesting findings.

And today I’m going to share what we found with you.

Here is a Summary of Our Key Findings:

1. Backlinks remain an extremely important Google ranking factor. We found the number of domains linking to a page correlated with rankings more than any other factor.

2. Our data also shows that a site’s overall link authority (as measured by Ahrefs Domain Rating) strongly correlates with higher rankings.

3. We discovered that content rated as “topically relevant” (via MarketMuse), significantly outperformed content that didn’t cover a topic in-depth. Therefore, publishing focused content that covers a single topic may help with rankings.

4. Based on SERP data from SEMRush, we found that longer content tends to rank higher in Google’s search results. The average Google first page result contains 1,890 words.

5. HTTPS had a reasonably strong correlation with first page Google rankings. This wasn’t surprising as Google has confirmed HTTPS as a ranking signal.

6. Despite the buzz around Schema, our data shows that use of Schema markup doesn’t correlate with higher rankings.

7. Content with at least one image significantly outperformed content without any images. However, we didn’t find that adding additional images influenced rankings.

8. We found a very small relationship between title tag keyword optimization and ranking. This correlation was significantly smaller than we expected, which may reflect Google’s move to Semantic Search.

9. Site speed matters. Based on data from Alexa, pages on fast-loading sites rank significantly higher than pages on slow-loading sites.

10. Despite Google’s many Penguin updates, exact match anchor text appears to have a strong influence on rankings.

11. Using data from SimilarWeb, we found that low bounce rate was associated with higher Google rankings.

We have detailed data and information of our findings below.

New Bonus Section: Get access to a free search engine ranking factors bonus section. This section includes a PDF checklist, a step-by-step case study, in-depth tutorials, and more. Click here to get access to the bonus section.

The Number of Referring Domains Has a Very Strong Influence on Rankings

You may have heard that getting backlinks from the same domain has diminishing returns.

In other words, it’s better to get 10 links from 10 different sites than 10 links from the same domain.

According to our analysis, this appears to be the case. We found that domain diversity has a substantial impact on rankings.

05_Number of Referring Domains_line

Google wants to see several different sites endorsing your page. And the more domains that link to you, the more endorsements you have in the eyes of Google.

In fact, the number of unique referring domains was the strongest correlation in our entire study.

Key Takeaway: Getting links from a diverse group of domains is extremely important for SEO. If you cannot get them, buy them from https://serpninja.io/pbn/ because these have to be acquired at any cost.

Authoritative Domains Tend to Rank Higher in Google’s Search Results

Not surprisingly, we found that a website’s overall link authority (measured using Ahrefs Domain Rating) was strongly tied to Google rankings:

08_Domain Link Authority (AHREFs Domain Rating)_line

In fact, a website’s overall authority had a stronger correlation to rankings than the authority of the page.

In other words, the domain that your page lives on is more important than the page itself.

Key Takeaway: Increasing the number of links to your site may improve rankings for other pages on your site.

Publishing Comprehensive, In-Depth Topical Content May Improve Rankings

In the early days of SEO, Google would determine a page’s topic by looking strictly at the keywords that appeared on the page.

If the keyword appeared on the page X number of times, Google would determine that the page was about that keyword. Today, thanks largely to the Hummingbird Algorithm, Google now understands the topic of every page.

For example, when you search for “who was the director of back to the future”…

google search for hummingbird

…Google doesn’t look for pages that contain the keyword “who was the director of Back to the Future”.

Instead, it understands the meaning of the question, and provides an answer:

google knowledge graph

As you might expect, this has a significant impact on how we optimize our content for SEO. In theory, Google should prefer content that covers a single topic in-depth.

But does the data agree with that assumption?

To find out we used MarketMuse to analyze 10,000 of the URLs from our data set for “Topical Authority”.

And we discovered that comprehensive content significantly outperformed shallow content.

07_Content Topic Authority (MarketMuse Data)_line

This is interesting. But how do you write content that Google considers comprehensive?

Let’s look at two examples from our data set to find out.

First, we have this article on the Daily Press about the Busch Gardens fun card:

example of page with low topical authority

This page has many of the traditional metrics that result in first page rankings. For example, the page uses the keyword in the title tag and the H1 tag. Also, the domain (Dailypress.com) is very authoritative (Ahrefs Domain Rating of 64).

However, this page ranks only #10 for the keyword: “Busch Gardens fun card”.

google ranking number 10 on first page

This low ranking is partly due to the fact the content on the page has a very low Topical Authority score.

On the flip side, we have this page about making Balinese satay sauce.

comprehensive topic content

This page provides a wealth of information on satay sauce. This piece of content covers the history of satay sauce in Indonesia, how the sauce is used, a recipe, and even provides nutrition facts.

Even though this page doesn’t use the term “Indonesian Satay Sauce” anywhere on the page, it ranks on the first page for that keyword:

google hummingbird ranking

Part of the explanation for that ranking is that this page has a high Topical Authority for the topic: “Indonesian Satay Sauce”.

Key Takeaway: Writing comprehensive, in-depth content can help you rank higher in Google.

Long-Form Ranks Higher in Google’s Search Results Than Short-Form Content

Does long-form content outperform short, 200-word blog posts?

We turned to our data set to find out.

After removing outliers from our data (pages that contained fewer than 51 words or more than 9999 words), we discovered that pages with longer content ranked significantly better than short content.

02_Content Total Word Count_line

In fact, the average word count of a Google first page result is 1,890 words.

Several other search engine ranking factors studies have found that longer content performed better in Google.

This correlation could be due to the fact that longer content generates significantly more social shares. Or it could be an inherent preference in Google for longer articles.

Another theory is that longer content boosts your page’s topical relevancy, which gives Google a deeper understanding of your content’s topic.

Also, long-form content’s ranking advantage could simply reflect site owners that care about publishing excellent content. This being a correlation study, it’s impossible for us to pinpoint why longer content performs so well in terms of search engine rankings.

However, when you combine our data with what’s already out there, it paints a clear picture that long-form content is best for SEO.

Key Takeaway: Long-form content ranks higher in Google’s search results than short-form content. The average word count of a Google first page result is 1,890 words.

HTTPS is Moderately Correlated with Higher Rankings

Last year Google called on webmasters to switch their sites over to secure HTTPS. They even called HTTPS a “ranking signal“.

What does our data say?

Although not a super-strong correlation, we did find that HTTPS correlated with higher rankings on Google’s first page.

Use of HTTPS_line

Does this mean you should make the switch to HTTPS today? Obviously, the decision is yours. But switching your site to HTTPS is a serious project that can cause real technical headaches.

Before you make the plunge to HTTPS, check out these guidelines from Google.

Key Takeaway: Because the association between HTTPS and ranking wasn’t especially strong — and the fact that switching to HTTPS is a resource-intensive project — we don’t recommend switching to HTTPS solely for SEO. But if you’re launching a new site, you want to have HTTPS in place on day one.

There is No Correlation Between Schema Markup and Rankings

There’s been a lot of buzz about Schema markup and SEO.

The theory goes something like this:

Schema markup gives search engines a better understanding of what your content means. This deeper understanding will encourage them to show your site to more people.

For example, you can use the <name> structured data tag to let Google know that when you use the word “Star Wars”, you’re referring to the original movie title…not the franchise in general:

schema markup example 2

Or you can use Schema to show ratings for products on your ecommerce site:

schema star ratings

All of these things should help with your rankings. In fact, Google’s John Mueller hinted that they might use structured data as a ranking signal in the future.

However, according to our analysis, the presence of structured data had no relationship with Google rankings.

Presence of Schema Markup

Key Takeaway: Feel free to use structured data on your site. But don’t expect it to have an impact on your rankings.

Shorter URLs Tend to Rank Better than Long URLs

I typically recommend that people use short URLs for the sake of better on-page SEO.

Why?

There are two reasons:

First, a short URL like backlinko.com/my-post is easier for Google to understand than backlinko.com/1/12/2016/blog/category/this-is-the-title-of-my-blog-post.

In fact, according to Google’s Matt Cutts, after 5 words in your URL:

“[Google] algorithms typically will just weight those words less and just not give you as much credit.”

And our data supports the use of shorter URLs.

URL Length_line

Fortunately, this guideline is easy to put into practice. Whenever you publish a new piece of content, make the URL short and sweet.

If you use WordPress, you can set your permalink structure to “post name”:

wordpress URL permalinks

Then, whenever you write a post, modify the URL to include a few words:

changing the url

Quick word of warning: make sure the new permalinks only apply to future posts. If you change the permalinks for older posts it can cause serious SEO-related issues.

For example, the URL for my post: 21 Actionable SEO Techniques You Can Use Right Now is simply my target keyword:

google url

Second, a long URL tends to point to a page that’s several clicks from the homepage. That usually means that there’s less authority flowing to that page. Less authority means lower rankings.

For example, this URL to an iPad product page on BestBuy.com represents a page that’s far removed from the site’s authoritative homepage:

long url

Key Takeaway: Use short URLs whenever possible as they may give Google a better understanding about your page’s true topic.

Content With At Least One Image Ranks Higher Than Content That Lacks an Image
(But Using Lots of Images Doesn’t Make a Difference)

Industry studies have found that image-rich pages tend to generate more total views and social shares.

This suggests that including lots of images in your content can boost shares, which should therefore improve Google rankings.

To measure the impact of image use on rankings we looked at the presence or absence of an image in the body of the page (in other words, in the content of the page).

According to our data, using at least one image in your content is significantly better than having no image at all.

Content Contains At Least 1 Image_line

However, when we looked at the link between the total number of images and rankings, we didn’t find any correlation.

This suggests that there’s a point of diminishing returns when it comes to image usage and rankings.

Key Takeaway: Using a single image is clearly better than zero images. Including lots of images doesn’t seem to have an impact on search engine rankings.

Using An (Exact) Keyword in Your Page’s Title Tag Has a Small Correlation With Rankings

Since the early days of search engines the title tag has been (by far) the most important on-page SEO element.

Because your title tag gives people (and search engines) an overview of your page’s overall topic, the words that appear in your title tag have long had a significant impact on rankings.

However, we wanted to see whether or not Google’s move towards Semantic Search has made the title tag any less important.

We found that title tag keyword usage still slightly correlates with rankings. However, it had a much smaller relationship than we anticipated.

Keyword-Appears-in-Title-Tag-(Exact-Match)_line

This finding suggests that Google doesn’t need to see the exact keyword in your title tag to understand your page’s topic.

For example, here are the top six results for the keyword “list building”.

google top 6 results 1

Note how three of the top six results (including the #1 result) don’t contain the exact keyword “list building” in their title tag.

google top 6 results

This is a reflection of Google moving away from exact keyword usage to Semantic Search.

Key Takeaway: Including your target keyword in your title tag may help with rankings for that keyword. However, because of Semantic Search, the impact doesn’t appear to be nearly as great as it once was.

Pages On Fast-Loading Websites Rank Significantly Higher than Pages On Slow-Loading Websites

Since 2010, Google has used site speed as an official ranking signal.

But we were curious:

How much does site speed impact rankings?

We used Alexa’s domain speed to analyze the median load time of 1 million domains from our data set. In other words, we didn’t directly measure the loading speed of the individual pages in our data set. We simply looked at the average loading speed across the entire domain.

And we found a strong correlation between site speed and Google rankings:

Average Page Load Speed (for URL's domain)_line

Again, this is simply a correlation. Could it be that site owners that optimize for speed also optimize for SEO? Sure.

But having a fast-loading site certainly won’t hurt your SEO. So it makes sense to speed things up.

Key Takeaway: Fast-loading websites are significantly more likely to rank in Google.

More Total Backlinks = Higher Rankings

There’s been a lot of buzz about new ranking signals (like social signals) that search engines use today. Many have even gone on to say that backlinks are becoming less important.

We were curious to see whether or not Google still used the sheer number of backlinks as an algorithmic ranking signal.

To measure this, we used the Ahrefs API to determine the total number of backlinks pointing to each page in our data set.

We found that pages with the highest number of total backlinks tended to rank best in Google.

13_Total-External-Backlinks_line

Even though Google continues to add diversity to its algorithm, it appears that backlinks remain a critical ranking signal.

Key Takeaway: Pages with more backlinks tend to rank higher than pages with fewer backlinks.

Google Rankings Are Closely Tied to a Page’s Overall Link Authority

In addition to total backlinks, we wanted to answer the question:

Does a page’s overall authority influence rankings?

Most SEOs agree that backlink quality is just as important as backlink quantity.

In other words, it’s typically better to get a single link from an authoritative page than 100 links from 100 low-quality pages.

And our data supports this:

Webpage Link Authority (Ahrefs URL Rating)_line

According to Ahrefs’s measure of link authority (URL Rating), authoritative pages outrank pages with little link authority. However, this correlation wasn’t as strong as the impact of the total amount of referring domains.

Key Takeaway: The overall link authority of your page matters.

Exact Match Anchor Text Significantly Correlates With Rankings

Since Google released its Penguin update in 2012, many SEO professionals have advised against building backlinks with exact match anchor text. However, several search engine ranking studies have found that anchor text is still important.

That’s why we wanted to investigate whether or not anchor text remained an important ranking signal.

Our research shows that exact match anchor text strongly correlates with rankings.

In the early days of SEO, building backlinks with exact match anchor text was a very effective approach. For example, if you wanted to rank for the keyword “online flower delivery” you would make sure your links had anchor text like this:

example of exact match anchor text

However, Google has likely cracked down on this practice, starting with the initial Penguin update. For that reason, we don’t recommend building links that use exact match anchor text, despite the fact that it appears to have a strong impact on rankings.

Key Takeaway: Backlinks with exact match anchor text robustly correlate with rankings. However, because of the risk in exact match anchor text links, we don’t advise utilizing exact match anchor text as an SEO tactic.

Low Bounce Rates Are Strongly Associated With Higher Google Rankings

Many people in the SEO world have speculated that Google uses “user experience signals” (like bounce rate, time on site and SERP click-through-rate) as ranking factors.

To test this theory, we pulled 100,000 websites from our data set and analyzed them in SimilarWeb.

Specifically, we analyzed three user experience signals: bounce rate, time on site and SERP CTR.

We discovered that websites with low average bounce rates are strongly correlated with higher rankings.

Bounce-Rate_line

Please keep in mind that we aren’t suggesting that low bounce rates cause higher rankings.

Google may use bounce rate as a ranking signal (although they have previously denied it). Or it may be the fact that high-quality content keeps people more engaged. Therefore lower bounce rate is a byproduct of high-quality content, which Google does measure.

As this is a correlation study, it’s impossible to determine from our data alone.

Key Takeaway: Google may use bounce rate as a ranking signal. Or it may be a case of a correlation not equaling causation.

Conclusion

Special thanks to our data partners: SEMRush, Ahrefs, MarketMuse and SimilarWeb for making this study possible.

I also want to thank Eric Van Buskirk of ClickStream (Project Director), Zach Russell (Lead Developer), and Qi Zhao (Head Data Scientist) for their contributions.

Also, if you’d like to learn more about how we collected and analyzed our data, here is a link to our study methods.

And if you want help implementing these findings, then make sure to get access to the free search engine ranking factors bonus section.

Click the image below and enter your email to get access:

search engine ranking bonus section

Categories
Tech News

How to Kill a Process in Linux

In an operating system there are many programs, which may be run either by a user or by the OS itself (such as system services). Programs that are running on the system are called “processes”. Usually, a process terminates on its own when it’s done with its task, or when you ask it to quit by pressing a keyboard shortcut or clicking the “Close” button.

However, sometimes a process can hang up or consume a lot of CPU or RAM. In this situation, you would want to manually “kill” the process. In this article, we will look at various tools you can use to kill processes on a Linux system.

Locating the process to kill

In order to kill a process, you should first locate the details of the process. You can do this through four commands: top, ps, pidof and pgrep. Depending on the situation, you can use one of these commands for this purpose.

As we will see later in this article, you can kill a process by its name or its process ID (PID). The PID is a number that uniquely identifies a process. Killing by the process ID is useful when you want to kill only a specific process. On the other hand, killing by the process name is useful when you want to kill all running instances of a particular program.

Locating the process with the top command

We will first look at the top command. Fire up the top command by typing:

top

You will get an interactive interface, as shown below. You can browse through this list to find the name or the PID of the process you want to kill.

The top command showing a list of processes.

To browse through this list, you can use the up/down keys. Additionally, the top command also has ways to filter processes by CPU usage, user and process names, which you can read about in this guide.

The leftmost column contains the PID of the process, and the right side contains the program name. As an example, in the above screenshot we have the vnstatd process running with a process ID of 263.

Locating the process with ps and grep commands

Another way to get a list of processes is by running:

ps aux

In the above command, we have used the flags aux which have the following meanings:

  • a: Show processes for all users
  • u: Display the user who is using the process
  • x: Also show processes that aren’t attached to a terminal. (Without this, ps won’t show daemons or processes running in a GUI environment.)

The output of the command is similar to that of top. The PID is in the second column from the left, and the process name is in the rightmost column.

The results of the "ps aux" command.

The advantage of using ps is that you can easily filter this list with the grep command. For example, to find a process associated with the term “vnstat”, you can use:

ps aux | grep -i vnstat

Filtering processes with ps and grep, with a positive result.

Here, we got two results: the vnstatd process, as well as the grep process itself. Since grep was itself running with “vnstat” as its argument, it matched our search term too.

Thus, even when there are no “vnstat” related processes running, we would get one entry showing the grep process:

Filtering processes with ps and grep.

So, even though we got a result, there are no processes that are of interest to us.
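A common trick to keep grep itself out of the results is to wrap one character of the search term in brackets. A minimal sketch, using the same “vnstat” example:

```shell
# The pattern [v]nstat still matches "vnstat", but it does NOT match
# the literal text "[v]nstat" that appears in grep's own command line,
# so the grep process no longer shows up in its own results.
ps aux | grep -i '[v]nstat'
```

With this form, an empty result really does mean that no matching process is running.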

Finding the PID with pidof and pgrep

The top and ps/grep combination allows us to search for processes. On the other hand, if you know the exact name of a process, you can use pidof to find its PID.

Using pidof is pretty straightforward. To get the PIDs of a process with the exact name of “nginx”, use:

pidof nginx

If there are processes with the exact name of “nginx”, you will get a list of PIDs, as shown below. If there are none, you will get nothing as the output.

The pidof command

If you don’t know the full name, you can use pgrep instead of pidof. As an example, to search for all processes that contain “ngin” somewhere in their name, run:

pgrep ngin

This will match processes with the exact name of “nginx”, as well as any other process whose name contains “ngin”. On our system, we get all the PIDs that belong to “nginx”, as the screenshot below shows.

The pgrep command.

The pidof and pgrep commands give you far less information. As we shall see in the next section, there are some circumstances in which you can’t kill a process. The output of top and ps contains additional information that helps you determine if you can really kill a process.

What processes can you kill?

Now that we have located the process, it is time to kill it. However, before we learn how to do so, there are a few things you need to know.

If you are a normal user, you can kill your own processes, but not those that belong to other users. Both top and ps show the user under which a process is running. In the case of top, the second column contains the username. With ps aux, the first column contains the username.

However, a root user can kill all processes. You can either add sudo before any command to run it as root, or obtain a root shell by typing su, and then execute the command.

In Linux, when a process is killed, a “terminating signal” is delivered to the process. Although there are many different types of signals, we mostly deal with the “SIGTERM” and “SIGKILL” signals. They have a numeric value of 15 and 9 respectively. By default, all the process killing commands use “SIGTERM”, which allows the program to run some code before it exits, thus allowing it to terminate “gracefully”. If you want to terminate the process forcibly, you can use “SIGKILL” instead.
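To make the signal names and numbers concrete, here is a small self-contained sketch (it spawns a throwaway sleep so there is a real PID to signal):

```shell
# List all signals your system supports, with their numbers.
kill -l

# Start a throwaway process so we have a real PID to signal.
sleep 60 &
pid=$!

# These are equivalent: all send SIGTERM (15), letting the
# process clean up before it exits.
kill "$pid"          # same as: kill -15 "$pid"  or  kill -SIGTERM "$pid"

# SIGKILL (9) cannot be caught; use it only when SIGTERM fails:
# kill -9 "$pid"
```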

The Linux kernel maintains some information related to the state of a process. When a process terminates, the kernel must keep the information around, so that the parent process can find out if the child process was able to complete its tasks and whether it terminated on its own, or it was killed. Until the parent has done so, these “zombie” processes will appear in the list of processes. You can’t kill such a process because it’s just an entry in the list of all processes, and it doesn’t have an actual process associated with it.

When a process is waiting on certain input/output operations (such as reading from or writing to disks), it is said to be in a state of “uninterruptible sleep”. You can’t kill a process while it is in this state.

You can tell if a process is in the “zombie” (Z) or “uninterruptible sleep” (D) state by looking at the 8th column of the top/ps output.
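You can also ask ps for the state of one specific process. A minimal sketch (it spawns its own sleep to have a PID to inspect; the single-letter state codes are standard ps behavior):

```shell
# Show the state of a single process. In the STAT column,
# "S" means sleeping, "Z" means zombie, "D" means uninterruptible sleep.
sleep 60 &
ps -o pid,stat,comm -p "$!"
kill "$!"
```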

Killing a process

There are various commands you can use to kill a process: kill, killall, pkill and top. We will begin with the simplest one: the killall command.

Killing processes with the killall command

The killall command is one of the easiest ways to kill a process. If you know the exact name of a process, and you know that it’s not running as another user and it is not in the Z or D states, then you can use this command directly; there’s no need to manually locate the process as we described above.

By default, killall sends the SIGTERM signal. For example, to kill a process named “firefox”, run:

killall firefox

To forcibly kill the process with SIGKILL, run:

killall -9 firefox

You can also use -SIGKILL instead of -9.

If you want to kill processes interactively, you can use -i like so:

killall -i firefox

If you want to kill a process running as a different user, you can use sudo:

sudo killall firefox

You can also kill a process that has been running for a certain period of time with the -o and -y flags. So, if you want to kill a process that has been running for more than 30 minutes, use:

killall -o 30m <process-name>

If you want to kill a process that has been running for less than 30 minutes, use:

killall -y 30m <process-name>

Similarly, use the following abbreviations for the respective units of time:

  • s: seconds
  • m: minutes
  • h: hours
  • d: days
  • w: weeks
  • M: months
  • y: years

Killing processes with the pkill command

Sometimes, you only know part of a program’s name. Just like pgrep, pkill allows you to kill processes based on partial matches. For example, if you want to kill all processes with “apache” in the name, run:

pkill apache

If you want to use a SIGKILL instead of a SIGTERM, use:

pkill -9 apache

Again, you can also use -SIGKILL instead of -9.
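pkill also accepts the -f flag to match against the full command line instead of just the process name, which helps when several programs share a name. A self-contained sketch:

```shell
# Start a distinctive process, then kill it by matching its full
# command line rather than just its name ("sleep").
sleep 12345 &

# -f matches the entire command line. The [5] bracket trick keeps
# this pattern from accidentally matching any shell that has the
# literal text "sleep 12345" in its own command line.
pkill -f 'sleep 1234[5]'
```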

Killing processes with the kill command

Using the kill command is straightforward. Once you have found out the PID of the process that you want to kill, you can terminate it using the kill command. For example, if you want to kill a process having a PID of 1234, then use the following command:

kill 1234

As we mentioned previously, the default is to use a SIGTERM. To use a SIGKILL, use -9 or -SIGKILL as we have seen before:

kill -9 1234
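kill also combines naturally with pidof. A sketch that terminates every process named exactly “firefox”, guarded so that nothing happens when none is running:

```shell
# pidof prints the matching PIDs (or nothing, and exits non-zero);
# the if-guard avoids calling kill with no arguments.
if pids=$(pidof firefox); then
    kill $pids    # $pids is unquoted on purpose: one argument per PID
fi
```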

Killing processes with the top command

It is very easy to kill processes using the top command. First, search for the process that you want to kill and note the PID. Then, press k while top is running (this is case sensitive). It will prompt you to enter the PID of the process that you want to kill.

After you enter the PID, press enter. Now it will ask which signal you want to use to kill the process. If you want to use SIGTERM (15), then simply press enter, as it is the default signal. If you want to use SIGKILL (9), then type 9 and press enter.

If you leave the process ID blank and hit enter directly, it will terminate the topmost process in the list. You can scroll using the arrow keys, and change the process you want to kill in this way.

Conclusion

In this post, we saw the various ways to kill processes in Linux. Learning these commands is essential for proper system administration and management. If you want to explore more of those commands, have a look at their respective man pages.

If you liked this post, please share it!

Categories
Post Formats

Smashing the Web By Improving Fonts’ Technologies

Fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunteque porro quisqu dolorem ipsum quia dolo sit amet, consectetur, adipisci.

Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed. Integer sollicitudin, justo in consectetur malesuada, ligula purus blandit nisi.

Qrabitur fermentum lobortis ipsum pulvinar varius. Curabitur convallis porttitor viverra. Aenean ipsum est, porta ut tristique at, feugiat quis nulla. Vestibulum tellus nisi, fringilla ut nunc lacinia.

Pixel-perfect design specs

Fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunteque porro quisqu dolorem ipsum quia dolo sit amet, consectetur, adipisci erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci.

Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed.

Integer sollicitudin, justo in consectetur malesuada, ligula purus blandit nisi. Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed.

Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in tellus semper sed.

Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed.

Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed.

Integer sollicitudin, justo in consectetur malesuada, ligula purus blandit nisi. Donec interdum erat eget neque euismod consequat vitae eget tellus. Donec id libero id tellus fringilla congue eget vel orci. Integer tincidunt venenatis odio, in lacinia tellus semper sed.

Categories
Tech News

Web Apps and Servers Using JavaScript Are Vulnerable to ReDoS Attacks

Web applications and web servers using JavaScript are vulnerable to a specific type of attack known as a regular expression denial of service (ReDoS). The attacker sends carefully crafted input to a JavaScript-based web application; if the application is not designed to handle such cases, the attacker can end up freezing it while it consumes enormous resources trying to match the pattern.

Why can ReDoS do a lot of damage to JS web servers?

JavaScript’s single-threaded, event-loop-based model makes this attack more damaging than in many other programming languages, since every request to the server is handled by the same thread. With a ReDoS attack, that single thread ends up clogged, blocking the entire server.

ReDoS attacks have been gaining momentum as most applications nowadays use JavaScript in some form or another, yet the issue went unnoticed for over half a decade. A research paper published in 2017 showed that more than 5% of the total vulnerabilities found in Node.js are ReDoS vulnerabilities. The latest results show that ReDoS attacks are gaining momentum in the JavaScript community, since the problem was left unaddressed for so many years.

Two researchers from the University of Darmstadt in Germany, Cristian-Alexandru Staicu and Michael Pradel, found 25 previously unknown vulnerabilities in Node.js modules.

The exploit packages may cause vulnerable systems to freeze for a number of minutes when the server tries to match the pattern in the regular expression in order to decide what to do with the sent payload.
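To see why matching can take minutes, consider the classic vulnerable pattern (a+)+$ run against a string of a’s followed by a single b: the engine must try exponentially many ways of splitting the a’s between the inner and outer groups before it can fail. A sketch of the blow-up, using Python’s backtracking regex engine as a stand-in for JavaScript’s (both backtrack in the same way; the exact timings will vary by machine):

```shell
python3 - <<'EOF'
import re
import time

pattern = re.compile(r"^(a+)+$")

# Each extra "a" roughly doubles the matching time, because the
# trailing "b" forces the engine to backtrack through every way of
# grouping the a's before it can reject the input.
for n in (10, 16, 22):
    text = "a" * n + "b"
    start = time.time()
    pattern.match(text)
    print(f"n={n}: {time.time() - start:.4f}s")
EOF
```

A malicious request only needs to push n a little higher to tie up the thread for minutes, which on a single-threaded Node.js server stalls every other request.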

How many libraries were affected?

The researchers scanned 2,846 popular Node.js libraries; over 300 of them were found to contain ReDoS vulnerabilities.

“ReDoS poses a serious threat to the availability of these sites,” the research team said. “Our results are a call-to-arms for developing techniques to detect and mitigate ReDoS vulnerabilities in JavaScript.”