Thursday, 27 September 2018

What Are Rootkits? || techtalksgroup ||


What Is a Rootkit?

A rootkit is a clandestine computer program designed to provide continued privileged access to a computer while actively hiding its presence. The term rootkit is a combination of the two words "root" and "kit." Originally, a rootkit was a collection of tools that enabled administrator-level access to a computer or network: "root" refers to the administrator account on Unix and Linux systems, and "kit" refers to the software components that implement the tool. Today, rootkits are generally associated with malware – such as Trojans, worms, and viruses – that conceals its existence and actions from users and other system processes.

What Can a Rootkit Do?

A rootkit allows someone to maintain command and control over a computer without the computer user/owner knowing about it. Once a rootkit has been installed, the controller of the rootkit has the ability to remotely execute files and change system configurations on the host machine. A rootkit on an infected computer can also access log files and spy on the legitimate computer owner’s usage.

Rootkit Detection

It is difficult to detect rootkits. There are no commercial products available that can find and remove all known and unknown rootkits. There are various ways to look for a rootkit on an infected machine. Detection methods include behavioral-based methods (e.g., looking for strange behavior on a computer system), signature scanning and memory dump analysis. Often, the only option to remove a rootkit is to completely rebuild the compromised system.
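To give a flavor of the signature-scanning method mentioned above, here is a toy Python sketch that searches files for known byte patterns. The signature and detection name are invented for demonstration; real scanners use large, regularly updated signature databases.

```python
import tempfile

# Hypothetical demo signature; real products ship thousands of these.
SIGNATURES = {b"EVIL_ROOTKIT_V1": "Demo.Rootkit.A"}

def scan_file(path):
    """Return the names of any known signatures found in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for sig, name in SIGNATURES.items() if sig in data]

# Demo: plant the signature in a scratch file and scan it.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"harmless bytes EVIL_ROOTKIT_V1 more bytes")
print(scan_file(tmp.name))  # ['Demo.Rootkit.A']
```

Note that this only illustrates the idea; kernel-level rootkits can hook the very file-reading APIs such a scanner relies on, which is why memory dump analysis and offline scanning are also used.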

Rootkit Protection

Many rootkits penetrate computer systems by piggybacking on software you trust or arriving with a virus. You can safeguard your system against rootkits by keeping it patched against known vulnerabilities: patch your OS and applications, and keep your virus definitions up to date. Don't accept files or open email attachments from unknown sources, and be careful when installing software: read the end-user license agreements carefully.

Static analysis can detect backdoors and other malicious insertions such as rootkits. Enterprise developers as well as IT departments buying ready-made software can scan their applications to detect threats including "special" and "hidden-credential" backdoors.

So that's it. Hope you guys like it. If yes, then please comment down below, and don't forget to like, follow, and share our social media platforms.

Facebook Page:- https://www.facebook.com/theprogrammer.harshit/
Google Plus:-https://plus.google.com/u/0/communiti…/117296242526461886479
Blog:- https://www.techtalksgroup.blogspot.com
Instagram:- https://www.instagram.com/theprogrammer.harshit

What Is GodMode in Windows, and How Do You Activate It? || techtalksgroup ||


GodMode is a special folder in Windows that gives you quick access to over 200 tools and settings that are normally tucked away in the Control Panel and other windows and menus.

Once enabled, God Mode lets you do all sorts of things, like quickly open the built-in disk defragmenter, view event logs, access Device Manager, add Bluetooth devices, format disk partitions, update drivers, open Task Manager, change display settings, adjust your mouse settings, show or hide file extensions, change font settings, rename the computer, and a lot more.

The way GodMode works is actually very simple: just name an empty folder on your computer as outlined below, and then instantly, the folder will turn into a super-handy place to change all sorts of Windows settings.

The steps for turning on GodMode are exactly the same for Windows 10, Windows 8, and Windows 7:

Make a new folder, anywhere you like.

To do this, right-click or tap-and-hold on any empty space in any folder in Windows, and choose New > Folder.

Important: You need to make a new folder right now, not reuse an existing folder that already has files and folders in it. If you proceed to the naming step using a folder that already has data in it, all of those files will instantly become hidden; GodMode will work, but your files will not be accessible.
When asked to name the folder, copy and paste this into that text box:

God Mode.{ED7BA470-8E54-465E-825C-99712043E01C}


Note: The beginning “God Mode” text is just a custom name that you can change to whatever you wish to help you identify the folder, but make sure the rest of the name is exactly the same as you see above.

The folder icon will change to a Control Panel icon and anything after your custom folder name will disappear.

Tip: Although the previous step warned against using a non-empty folder, you can unhide your files and reverse GodMode if you accidentally did this to an existing folder by renaming the folder to remove the ".{ED7BA470-8E54-465E-825C-99712043E01C}" suffix.

Double-click or double-tap the new folder to open GodMode.
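If you prefer to script it, creating the folder is a one-liner. Here is a minimal Python sketch; the CLSID suffix is what triggers Windows Explorer's special view, and on non-Windows systems this simply creates an ordinary folder.

```python
import os

# The text before the dot is just a label you can change;
# the CLSID in braces must match exactly.
name = "GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}"
os.makedirs(name, exist_ok=True)
print(os.path.isdir(name))  # True
```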


Monday, 24 September 2018

New Zero-Day Vulnerability Found Affecting All Versions of Windows || techtalksgroup ||


A security researcher has publicly disclosed an unpatched zero-day vulnerability in all supported versions of the Microsoft Windows operating system (including server editions) after the company failed to patch the responsibly disclosed bug within the 120-day deadline.

Discovered by Lucas Leong of the Trend Micro Security Research team, the zero-day vulnerability resides in the Microsoft Jet Database Engine and could allow an attacker to remotely execute malicious code on any vulnerable Windows computer.

The Microsoft JET Database Engine, or simply JET (Joint Engine Technology), is a database engine integrated within several Microsoft products, including Microsoft Access and Visual Basic.

According to an advisory released by the Zero Day Initiative (ZDI), the vulnerability is due to a problem with the management of indexes in the Jet database engine that, if exploited successfully, can cause an out-of-bounds memory write, leading to remote code execution.
To exploit this vulnerability and remotely execute malicious code on a targeted vulnerable Windows computer, an attacker must convince a targeted user to open a specially crafted JET database file.
"Crafted data in a database file can trigger a write past the end of an allocated buffer. An attacker can leverage this vulnerability to execute code under the context of the current process," Trend Micro's Zero Day Initiative wrote in its blog post.
"Various applications use this database format. An attacker using this would be able to execute code at the level of the current process."
According to the ZDI researchers, the vulnerability exists in all supported Windows versions, including Windows 10, Windows 8.1, Windows 7, and Windows Server Edition 2008 to 2016.

ZDI reported the vulnerability to Microsoft on May 8, and the tech giant confirmed the bug on May 14, but failed to patch the vulnerability and release an update within the 120-day (4-month) deadline, leading ZDI to go public with the vulnerability details.
Proof-of-concept exploit code for the vulnerability has also been published by Trend Micro on its GitHub page. Microsoft is working on a patch, and since it was not included in the September Patch Tuesday, you can expect the fix in Microsoft's October patch release.
Trend Micro recommends that all affected users "restrict interaction with the application to trusted files" as a mitigation until Microsoft comes up with a patch.


Monday, 17 September 2018

Chrome vs Chromium || techtalksgroup ||


Chrome is a massively popular web browser that is developed and released by Google, and Chromium is a niche open-source browser that has far fewer users. However, Chrome and Chromium have a lot more similarities than differences. In fact, Chrome uses the same source code as Chromium, just with extra features that Google adds on top.

What is Chromium?

Chromium is an open-source web browser that's developed and maintained by the Chromium Project. Since it's open source, anyone is free to take and modify the source code as they please. However, only trusted members of the Chromium Project development community can actually contribute their own code.


Regular users are able to download a frequently updated version of Chromium, all compiled and ready to use, from download-chromium.appspot.com.

What is Chrome?

Chrome is a proprietary web browser that is developed, maintained, and released by Google. Since it's proprietary, you are free to download and use it, but you can't decompile, reverse engineer, or use the source code to build your own project.

Chrome is built on Chromium, which means that Google developers take the open-source Chromium source code and add their own proprietary code. For instance, Chrome has an automatic update feature, is capable of tracking your browsing data, and includes native support for Flash that Chromium lacks.

Chrome is available directly from Google.

The Biggest Differences Between Chromium and Chrome

Since both browsers are built on the same source code, there are two major differences between Chromium and Chrome: Chromium is updated far more frequently, and Google adds in a whole lot of extra stuff that you may or may not want.


Within those two broad categories, here are the seven most important specific examples where Chromium and Chrome are different from each other:


  • Chromium updates more frequently - Since Chromium is compiled directly from the Chromium Project source code, it changes constantly. Chrome has several release channels, but even the bleeding edge Canary channel updates less frequently than Chromium. If you want to get your hands on the absolute latest code that the Chromium Project has to offer, you need to use Chromium.
  • Chrome updates automatically - Chromium lacks an automatic update feature. So even though it updates more frequently, you need to update it manually. Since Chrome has an automatic update feature, it is capable of downloading and installing updates on its own. If you ever get too far out of date, it will even let you know.
  • Chrome tracks your web browsing - Chromium doesn't track your information, and Chrome does. If you don't want to provide Google with any information about your browsing habits on the internet, but you like Chrome, then Chromium may be an option.
  • Chrome locks you into the Chrome Web Store - By default, Chrome on Windows and Mac only lets you install extensions that you download from the Chrome Web Store, while Chromium allows outside extensions. If you want the same freedom in Chrome, you need to enable developer mode.
  • Chrome has native support for Adobe Flash - Flash isn't as widespread as it used to be, but there are still sites that don't work right if you don't have it. Since Flash isn't open source, Chromium doesn't support it natively. So if you want to use Flash in Chromium, and you aren't an expert, you may be in for a headache.
  • Chromium doesn't include closed-source media codecs - Chrome also includes licensed media codecs like AAC, H.264, and MP3 that Chromium doesn't. Without these codecs, media won't play in Chromium. So if you want to stream video on sites like Netflix and YouTube, you need to either use Chrome or install these codecs manually.
  • Chromium doesn't always have the security sandbox enabled by default - Both Chrome and Chromium have a security sandbox mode, but Chromium has it turned off by default in some cases.

Chromium vs. Chrome: Which One Wins?

Since Chromium and Chrome are so similar, and each one has benefits, it's difficult to say which one actually wins in a head-to-head fight. For most regular users, Chrome is the better choice, but for more advanced users, those who place an especially high value on privacy, and some Linux users, Chromium may be the way to go.

Who Should Use Chrome?

Anyone who wants to download a web browser and have it just work, right out of the box, should use Chrome instead of Chromium. This is especially true if you use either Windows or Mac.

Chrome is extremely easy to download and install, doesn't require any configuration, and you can use it to view movies and listen to music on the internet, and even view websites that use Flash, without a lot of extra headaches.

Who Should Use Chromium?

Chromium is a better choice for more advanced users who don't care about getting their hands a little dirty, and anyone who likes Chrome but doesn't want to be tracked by Google. It's also a viable choice for users of some Linux distributions that offer a modified version of Chromium that comes a lot closer to matching Chrome in terms of features.


Saturday, 15 September 2018

Pure Blood v2.0 - A Penetration Testing Framework Created For Hackers / Pentester / Bug Hunter ||techtalksgroup||


A Penetration Testing Framework created for Hackers / Pentester / Bug Hunter.

Web Pentest / Information Gathering:

  • Banner Grab
  • Whois
  • Traceroute
  • DNS Record
  • Reverse DNS Lookup
  • Zone Transfer Lookup
  • Port Scan
  • Admin Panel Scan
  • Subdomain Scan
  • CMS Identify
  • Reverse IP Lookup
  • Subnet Lookup
  • Extract Page Links
  • Directory Fuzz (NEW)
  • File Fuzz (NEW)
  • Shodan Search (NEW)
  • Shodan Host Lookup (NEW)

Web Application Attack: (NEW)
  • WordPress
    • WPScan
    • WPScan Bruteforce
    • WordPress Plugin Vulnerability Checker
      Features: // I will add more soon.
      • WordPress WooCommerce - Directory Traversal
      • WordPress Plugin Booking Calendar 3.0.0 - SQL Injection / Cross-Site Scripting
      • WordPress Plugin WP with Spritz 1.0 - Remote File Inclusion
      • WordPress Plugin Events Calendar - 'event_id' SQL Injection

Auto SQL Injection

Features:
  • Union Based
  • (Error Output = False) Detection
  • Tested on 100+ Websites

Generator:

  • Deface Page
  • Password Generator // NEW
  • Text To Hash //NEW


Installation

git clone https://github.com/cr4shcod3/pureblood
cd pureblood
pip install -r requirements.txt



Thursday, 13 September 2018

Top 5 Kali Linux Tools Every Hacker Should Know All About |techtalksgroup|


Top Kali Linux Tools:-
Let's get started with a list of my favorite tools, many of which are favorites among other hackers too.

1. Metasploit:-

Metasploit is a framework for developing exploits, shellcode, fuzzers, and payloads, with a very vast collection of exploits and exploitation tools bundled into a single framework. It is available for all major operating systems (Windows, OS X, and Linux) and comes pre-installed in Kali Linux. It is an offensive tool, used to attack your own or your company's infrastructure to check for security loopholes and fix them before an actual attacker can break in.
It can also be used to target web applications, networks, servers, and more. You get both a GUI and a command-line interface. There are two editions of Metasploit: the free Community version and the paid Metasploit Pro.

2. Nmap (Network Mapper):-

Nmap is used to scan whole networks for open ports, to map networks, and for a lot more. It is mainly used for scanning networks, discovering online hosts, and security auditing. Most network admins use Nmap to discover online computers, find open ports, and identify the services running on them. It uses raw IP packets in creative ways to determine which hosts are available on the network, which ports are open, and which services (application names and versions) are running on those systems.
It comes in both command-line and GUI versions; Zenmap is the GUI. I recommend learning the command line first, then moving on to the GUI once you feel confident.
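To illustrate the core idea behind a connect scan (which Nmap's -sT option performs with far more speed and sophistication), here is a minimal Python sketch that simply tries to complete a TCP handshake on each port. Only scan hosts you own or have permission to test.

```python
import socket

def scan(host, ports, timeout=0.5):
    """Return the ports on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demo against the local machine on a few common ports.
print(scan("127.0.0.1", [22, 80, 443]))
```

Nmap goes far beyond this, using raw SYN packets, timing tricks, and a fingerprint database to identify services and versions; this sketch only shows the underlying principle.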

3. Armitage:-

Armitage is a graphical cyber attack management tool and it provides a GUI interface for all Metasploit features and makes it easier to understand and use. If you really want to understand and grow into the advanced features then Armitage is a great choice for you.

Armitage organizes Metasploit’s capabilities around the hacking process. There are features for discovery, access, post-exploitation, and maneuver.

And if you are working in a team then it can be a real help to share information with your team:
  • Use the same sessions.
  • Share victim hosts, capture data, download files etc.
  • Communicate using a shared event log.
  • Run bots to automate the tasks.

4. John The Ripper (JTR):-

John the Ripper is a very popular password-cracking tool, also known as JTR, and it has the coolest name of all the tools. Mostly it is simply referred to as 'John'. It is the most commonly used tool for password cracking and for performing dictionary attacks. John the Ripper takes a text file, referred to as a 'wordlist', which contains a list of commonly used passwords or real passwords cracked before, hashes each candidate in the same way as the password being cracked, and then compares the output with the stored hash of the target password.

This tool can be used to perform different types of dictionary attacks. If you are unsure of the difference between John the Ripper and THC Hydra, the simplest explanation is that THC Hydra is used to crack passwords for online services, while John the Ripper is used for offline password cracking.
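The dictionary-attack loop described above can be sketched in a few lines of Python. SHA-256 is used here purely for illustration; John supports many hash formats and is vastly faster and smarter than this.

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Hash each candidate the same way as the target and compare."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None  # no candidate in the wordlist matched

# Demo: the "stolen" hash was made from a password in our wordlist.
wordlist = ["123456", "password", "letmein"]
target = hashlib.sha256(b"letmein").hexdigest()
print(dictionary_attack(target, wordlist))  # letmein
```

This is also why long, random passwords resist dictionary attacks: they simply never appear in any wordlist.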

5. Wireshark:-

Wireshark is an open-source tool for network analysis and for profiling network traffic and packets; tools of this kind are referred to as network sniffers.

Wireshark, previously known as Ethereal, is used to monitor network traffic and analyze the packets that are sent out. Wireshark can intercept network traffic ranging from connection-level information down to the individual bits that make up a single packet. All of this is done in real time and shown to the user in a readable format. The tool has seen a lot of development over the years, including display filters and color-coding of packets based on their contents, and these features really help penetration testers dig deeper into network traffic and inspect packets in detail.
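To give a taste of the decoding a sniffer performs on every captured packet, here is a sketch that unpacks a raw IPv4 header with Python's struct module. The sample packet is hand-built for demonstration, since capturing live traffic requires raw-socket privileges.

```python
import socket
import struct

def parse_ipv4_header(data):
    """Decode the fixed 20-byte IPv4 header from raw bytes."""
    (version_ihl, tos, total_len, ident, frag,
     ttl, proto, cksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", data[:20])
    return {
        "version": version_ihl >> 4,   # high nibble is the IP version
        "ttl": ttl,
        "protocol": proto,             # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# A hand-built sample header for demonstration.
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 0, 0, 64, 6, 0,
                     socket.inet_aton("192.168.0.2"),
                     socket.inet_aton("93.184.216.34"))
print(parse_ipv4_header(sample))
```

Wireshark applies this kind of decoding for hundreds of protocols at once, which is what makes its readable, layered packet view possible.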

Note: If you are really interested in Network administration and penetration testing then knowing how to use Wireshark is a required skill. There are a lot of resources available online from where you can learn about using Wireshark in depth.


Wednesday, 12 September 2018

Download The Free Kali Linux Book || tech talks group ||


Whether you are new to infosec or a seasoned security veteran, the free "Kali Linux Revealed" online course has something to teach you. The saying "you can't build a great building on a weak foundation" rings true in the information security field as well. If you use (or want to learn to use) Kali in a professional way, you should familiarise yourself as best you can with the internals of the penetration testing distribution, and that's what this training is all about: turning you into a professional Kali Linux user.

DOWNLOAD PDF - https://kali.training/downloads/Kali-Linux-Revealed-1st-edition.pdf

Learning how to master Kali gives you the freedom to create Kali Linux recipes like the Kali ISO of Doom or the Kali Evil AP. You'll be able to build optimized custom Kali kernels, host them on your own repositories, and create your own custom Kali appliances, and there's so much more.

After Reading This Book You Will Be Able To --

  • Use the Kali OS proficiently.
  • Automate, customize, and pre-seed Kali Linux installs.
  • Create Kali appliances such as the Kali ISO of Doom.
  • Build, modify, and host Kali packages and repositories.
  • Create, fork, and modify simple Kali packages.
  • Customize and rebuild your kernel.
  • Deploy Kali over the network.
  • Manage and orchestrate multiple installations of Kali.
  • Build and customize Kali ARM images.
  • Create custom pentesting devices.

Tuesday, 11 September 2018

FREE AND UNLIMITED FAST SPEED WITH -VPN HUB |TechTalksGroup|



VPN HUB - Free and unlimited fast speed on your mobile

UNBLOCK the Internet and Browse Securely with VPN HUB for Android. Get it Free on the Google Play Store.

Link: https://www.vpnhub.com/


Layers of the OSI Model Explained || tech talks group ||



The Open Systems Interconnection (OSI) model defines a networking framework to implement protocols in layers, with control passed from one layer to the next. It is primarily used today as a teaching tool. It conceptually divides computer network architecture into 7 layers in a logical progression. The lower layers deal with electrical signals, chunks of binary data, and routing of these data across networks. Higher levels cover network requests and responses, representation of data, and network protocols as seen from a user's point of view.

The OSI model was originally conceived as a standard architecture for building network systems and indeed, many popular network technologies today reflect the layered design of OSI.

1. Physical Layer 

At Layer 1, the Physical layer of the OSI model is responsible for ultimate transmission of digital data bits from the Physical layer of the sending (source) device over network communications media to the Physical layer of the receiving (destination) device. Examples of Layer 1 technologies include Ethernet cables and Token Ring networks. Additionally, hubs and other repeaters are standard network devices that function at the Physical layer, as are cable connectors.

At the Physical layer, data are transmitted using the type of signaling supported by the physical medium: electric voltages, radio frequencies, or pulses of infrared or ordinary light.

2.  Data Link Layer

When obtaining data from the Physical layer, the Data Link layer checks for physical transmission errors and packages bits into data "frames". The Data Link layer also manages physical addressing schemes such as MAC addresses for Ethernet networks, controlling access of any various network devices to the physical medium. Because the Data Link layer is the single most complex layer in the OSI model, it is often divided into two parts, the "Media Access Control" sublayer and the "Logical Link Control" sublayer.

3.  Network Layer

The Network layer adds the concept of routing above the Data Link layer. When data arrives at the Network layer, the source and destination addresses contained inside each frame are examined to determine whether the data has reached its final destination. If it has, Layer 3 formats the data into packets delivered up to the Transport layer. Otherwise, the Network layer determines the next hop and pushes the data back down to the lower layers for forwarding.

4.  Transport Layer

The Transport Layer delivers data across network connections. TCP is the most common example of a Transport Layer 4 network protocol. Different transport protocols may support a range of optional capabilities including error recovery, flow control, and support for re-transmission.

5.  Session Layer

The Session Layer manages the sequence and flow of events that initiate and tear down network connections. At Layer 5, it is built to support multiple types of connections that can be created dynamically and run over individual networks.

6.  Presentation Layer

The Presentation layer is the simplest in function of any piece of the OSI model. At Layer 6, it handles syntax processing of message data such as format conversions and encryption / decryption needed to support the Application layer above it.

7.  Application Layer

The Application layer supplies network services to end-user applications. Network services are typically protocols that work with the user's data. For example, in a Web browser, the Application layer protocol HTTP packages the data needed to send and receive Web page content. Layer 7 provides data to (and obtains data from) the Presentation layer.
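The layered flow described above can be illustrated with a toy encapsulation model: each layer wraps the data handed down from the layer above with its own header, and the receiving side strips the headers in reverse order. The header contents here are placeholders, not real protocol formats.

```python
def encapsulate(app_data: bytes) -> bytes:
    """Wrap application data the way the sending stack does."""
    segment = b"[TCP]" + app_data   # Layer 4: transport header
    packet = b"[IP]" + segment      # Layer 3: network header
    frame = b"[ETH]" + packet       # Layer 2: data-link header
    return frame                    # Layer 1 transmits these bits

def decapsulate(frame: bytes) -> bytes:
    """Strip the headers in reverse order, as the receiving stack does."""
    packet = frame.removeprefix(b"[ETH]")
    segment = packet.removeprefix(b"[IP]")
    return segment.removeprefix(b"[TCP]")

message = b"GET / HTTP/1.1"         # Layer 7: application data
assert decapsulate(encapsulate(message)) == message
```

Real headers carry the addressing and control fields each layer needs (MAC addresses at Layer 2, IP addresses at Layer 3, port numbers at Layer 4), but the wrap-and-unwrap pattern is exactly this.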



Monday, 10 September 2018

UK’s Critical Infrastructure Vulnerable To DDoS Attacks ||tech talks group||


According to data revealed under the Freedom of Information Act by Corero Network Security, over one-third of critical infrastructure organizations in the UK are vulnerable to DDoS attacks. As per Corero, 39 percent of companies have ignored the risk of attacks on their network, leaving themselves vulnerable to data breaches, malware, and ransomware.

In a statement issued today, Sean Newman, director of product management at Corero, comments: “Cyber-attacks against national infrastructure have the potential to inflict significant, real-life disruption and prevent access to critical services that are vital to the functioning of our economy and society. These findings suggest that many such organizations are not as cyber resilient as they should be, in the face of growing and sophisticated cyber threats.”

Newman adds, “By not detecting and investigating these short, surgical, DDoS attacks on their networks, infrastructure organizations could also be leaving their doors wide-open for malware or ransomware attacks, data theft or more serious cyber attacks.”

Under the UK government’s proposals to implement the EU’s Network and Information Systems (NIS) directive, these organizations could be liable for fines of up to £17 million, or four percent of global turnover.

David Emm, the principal security researcher at Kaspersky Lab said, “The world isn’t ready for cyber-threats against critical infrastructure – but criminals are clearly ready and able to launch attacks on these facilities. We’ve seen attempts on power grids, oil refineries, steel plants, financial infrastructure, seaports and hospitals – and these are cases where organizations have spotted attacks and acknowledged them. However, many more companies do neither, and the lack of reporting these incidents hampers risk assessment and response to the threat.”

Edgard Capdevielle, CEO of Nozomi Networks, also commented: "This report emphasizes the impact of DDoS attacks and how they are often used as a cover to distract security teams while infecting systems with malware or stealing data. Such initiatives are often the first step in 'low and slow' attacks." He further added that "in light of this information, CNI organizations should give a high priority to re-assessing their cyber-security programs, evaluate where they are in relation to government recommendations, and inform themselves about current technologies available for protection….The right approach is to both shore up defenses and be able to quickly respond when attacks do occur."

On the subject of targeting CNI, Eldon Sprickerhoff, founder and chief security strategist at eSentire, said, "Although cyber-security regulations will require significant effort for the companies that are affected, this new legislation by the UK government demonstrates that they understand the severity of cyber-threats in today’s digital world and the destruction they can cause, if undeterred. Even if you’re not a CNI, cyber-threats should concern you. With cyber-criminals constantly adjusting their tactics, it is imperative that companies never stop defending themselves by constantly improving and expanding their cyber-security practices. Managed detection and response and incident response planning are common ways companies can stay ahead of their attackers."


Here are five tips to help you stay ahead of cybercriminals:
  • Encryption – store sensitive data so that it is readable only with a digital key
  • Integrity checks – regularly check for any changes to system files
  • Network monitoring – use tools that help you detect suspicious behavior
  • Penetration testing – conduct controlled cyber-attacks on your own systems to test their defenses and identify vulnerabilities
  • Education – train your employees in cyber-security awareness and tightly manage access to any confidential information
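The "integrity checks" tip above can be sketched in a few lines of Python: record a baseline of file hashes, then re-check later to spot unexpected changes. The file used here is a temporary demo file; a real deployment would cover system binaries and configuration files and store the baseline somewhere an attacker can't rewrite it.

```python
import hashlib
import tempfile

def fingerprint(paths):
    """Map each path to the SHA-256 hash of its contents."""
    digests = {}
    for p in paths:
        with open(p, "rb") as f:
            digests[p] = hashlib.sha256(f.read()).hexdigest()
    return digests

def changed(baseline, paths):
    """Return the paths whose current hash differs from the baseline."""
    current = fingerprint(paths)
    return [p for p in paths if current[p] != baseline.get(p)]

# Demo: record a baseline, tamper with the file, and detect the change.
with tempfile.NamedTemporaryFile(delete=False, mode="w") as tmp:
    tmp.write("setting=1")
baseline = fingerprint([tmp.name])
with open(tmp.name, "w") as f:
    f.write("setting=2")   # simulated tampering
print(changed(baseline, [tmp.name]))  # lists the tampered file
```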



Thursday, 6 September 2018

Everything you must know about RFCs, or Internet Requests for Comments || tech talks group ||


Request for Comments documents have been used by the Internet community for more than 40 years as a way to define new standards and share technical information. Researchers from universities and corporations publish these documents to offer best practices and solicit feedback on Internet technologies. RFCs are managed today by a worldwide organization called the Internet Engineering Task Force (IETF).



The very first RFCs, including RFC 1, were published in 1969. Although the "host software" technology discussed in RFC 1 has long since become obsolete, documents like this one offer an interesting glimpse into the early days of computer networking. Even today, the plain-text format of the RFC remains essentially the same as it has been since the beginning.

Many popular computer networking technologies have been documented in RFCs over the years during their early stages of development.

Even though the basic technologies of the Internet have matured, the RFC process continues running through the IETF. Documents are drafted and progress through several stages of review before final ratification. The topics covered in RFCs are intended for highly specialized professional and academic research audiences. Rather than Facebook-style public comment postings, comments on RFC documents are given through the RFC Editor site. Final standards are published in the master RFC Index.

Do Non-Engineers Need to Worry About RFCs?

Because the IETF is staffed with professional engineers, and because it tends to move very slowly, the average Internet user doesn't need to focus on reading RFCs. These standards documents are intended to support the underlying infrastructure of the Internet; unless you're a programmer dabbling in networking technologies, you're likely to never need to read them or even be familiar with their content.


However, the fact that the world's network engineers do adhere to RFC standards means that the technologies we take for granted -- Web browsing, sending and receiving email, using domain names -- are global, interoperable and seamless for consumers.

So that's it. Hope you guys like it. If yes, then please comment down below and do not forget to like, follow and share our social media platforms. 

Wednesday, 5 September 2018

Everything you must know about the History of Linux Operating System..||tech talks group||




Linux is an operating system used to power pretty much any device you can think of.

Linux Overview

When most people think of Linux they think of a desktop operating system used by geeks and techies or a server-based operating system used to power websites.
Linux is everywhere. It is the engine behind most smart devices. The Android phone that you are using runs a Linux kernel; that smart fridge that can restock itself runs Linux. There are smart lightbulbs that can talk to each other, all with the help of Linux. Even rifles used by the army run Linux.
A modern buzz term is "the Internet of Things". The truth is that there really is only one operating system that powers the Internet of Things, and that is Linux.
From a business point of view, Linux is also used on large supercomputers, and it is used to run the New York Stock Exchange.
Linux can also, of course, be used as the desktop operating system on your netbook, laptop or desktop computer.

Operating Systems

The operating system is special software used to interact with the hardware within a computer.
If you consider a standard laptop, the hardware devices that the operating system has to manage include the CPU, the memory, the graphics processing unit, a hard drive, a keyboard, mouse, screen, USB ports, a wireless network card, an ethernet card, a battery and the backlight for the screen.
In addition to the internal hardware, the operating system also needs to be able to interact with external devices such as printers, scanners, joypads and a wide array of USB powered devices.
The operating system has to manage all the software on the computer, making sure each application has enough memory to perform, switching processes between being active and inactive.
The operating system has to accept input from the keyboard and act upon the input to perform the wishes of the user.
Examples of operating systems include Microsoft Windows, Unix, Linux, BSD, and macOS.

Overview of GNU/Linux

A term you might hear every now and then is GNU/Linux. What is GNU/Linux and how does it differ from normal Linux?
From a desktop Linux user point of view, there is no difference.
Linux is the main engine that interacts with your computer's hardware. It is commonly known as the Linux kernel.
The GNU tools provide a method of interacting with the Linux kernel.

GNU Tools

Before providing a list of tools, let's look at the sort of tools you will need to be able to interact with the Linux kernel.
First of all, at the very basic level, before even considering the concept of a desktop environment, you will need a terminal, and the terminal must accept commands which the Linux operating system will use to perform tasks.
The common shell used to interact with Linux in a terminal is a GNU tool called Bash. To get Bash onto the computer in the first place, it needs to be compiled, so you also need a compiler and an assembler, which are also GNU tools.
In fact, GNU is responsible for a whole chain of tools which make it possible to develop programs and applications for Linux.
One of the most popular desktop environments is called GNOME, which stands for GNU Network Object Model Environment. Snappy, isn't it?
The most popular graphics editor is called GIMP, which stands for GNU Image Manipulation Program.
The people behind the GNU project sometimes get annoyed that Linux gets all the credit when it is their tools that power it.
My view is that everyone knows who makes the engine in a Ferrari; nobody really knows who makes the leather seats, the audio player, the pedals, the door trims and every other part of the car, but they are all equally important.

The Layers That Make Up A Standard Linux Desktop

The lowest component of a computer is the hardware.
On top of the hardware sits the Linux kernel.
The Linux kernel itself has multiple levels.
At the bottom sit the device drivers and security modules used to interact with the hardware.
On the next level, you have process schedulers and memory management used for managing the programs that run on the system.
Finally, at the top, there are a series of system calls which provide methods for interacting with the Linux kernel.
Above the Linux kernel are a series of libraries which programs can use to interact with the Linux system calls.
Just below the surface are the various low-level components such as the windowing system, logging systems, and networking.
Finally, you get to the top and that is where the desktop environment and desktop applications sit.
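As a small illustration of that top layer, the sketch below uses Python's standard os module, whose functions are thin wrappers around the kernel's system calls: on a POSIX-style system, os.pipe, os.write and os.read map almost directly onto the pipe, write and read calls the kernel exposes.

```python
import os

# pipe() asks the kernel for a connected pair of file descriptors.
read_fd, write_fd = os.pipe()

# write() and read() hand bytes to and from the kernel directly,
# bypassing the buffered file objects of the standard library.
os.write(write_fd, b"via syscall")
data = os.read(read_fd, 1024)

os.close(read_fd)
os.close(write_fd)
print(data.decode())  # via syscall
```

The C library (glibc on most Linux systems) provides the same wrappers one layer down; Python is just a convenient way to watch the boundary in action.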

A Desktop Environment

A desktop environment is a series of graphical tools and applications which make it easier for you to interact with your computer and basically get stuff done.
A desktop environment in its simplest form can just include a window manager and a panel. There are many levels of sophistication between the simplest and fully featured desktop environments.
For instance, the lightweight LXDE desktop environment includes a file manager, session editor, panels, launchers, a window manager, image viewer, text editor, terminal, archiving tool, network manager and music player.
The GNOME desktop environment includes all of that plus an office suite, web browser, GNOME-boxes, email client and many more applications.

So that's it. Hope you guys like it. If yes, then please comment down below and do not forget to like, follow and share our social media platforms. 

Sunday, 2 September 2018

What is Network Application Programming Interface (Network APIs)..? || tech talks group ||

An Application Programming Interface (API) lets computer programmers access the functionality of published software modules and services. An API defines data structures and subroutine calls that can be used to extend existing applications with new features, and build entirely new applications on top of other software components. Some of these APIs specifically support network programming.

Network programming is a type of software development for applications that connect and communicate over computer networks including the Internet. Network APIs provide entry points to protocols and re-usable software libraries. Network APIs support Web browsers, Web databases, and many mobile apps. They are widely supported across many different programming languages and operating systems.



Socket Programming

Traditional network programming followed a client-server model. The primary APIs used for client-server networking were implemented in socket libraries built into operating systems. Berkeley sockets and Windows Sockets (Winsock) APIs were the two primary standards for socket programming for many years.
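A minimal sketch of that client-server model, using Python's standard socket module (itself a wrapper over the Berkeley sockets API) on the loopback interface; the echo behavior is just for illustration:

```python
import socket
import threading

def echo_server(sock):
    # Accept one connection and echo back whatever the client sends.
    conn, _ = sock.accept()
    with conn:
        payload = conn.recv(1024)
        conn.sendall(payload)

# Server side: bind to an ephemeral loopback port and listen.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_server, args=(server,))
t.start()

# Client side: connect, send a message, read the echo back.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply.decode())  # hello
```

The same bind/listen/accept and connect/send/recv calls exist, with minor naming differences, in Winsock and in the original Berkeley C API.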

Remote Procedure Calls

RPC APIs extend basic network programming techniques by adding the capability for applications to invoke functions on remote devices instead of just sending messages to them. With the explosion of growth on the World Wide Web (WWW), XML-RPC emerged as one popular mechanism for RPC.
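The XML-RPC wire format can be seen without any network at all. This sketch uses Python's standard xmlrpc.client to marshal a call to a hypothetical add method into XML, then parse it back the way a server would:

```python
import xmlrpc.client

# Marshal a call to a hypothetical "add" method with two arguments
# into the XML-RPC wire format (no network involved).
request = xmlrpc.client.dumps((3, 4), methodname="add")
print(request)

# A server receiving this XML parses it back into Python values.
params, method = xmlrpc.client.loads(request)
print(method, params)  # add (3, 4)
```

In a real deployment the request string travels in the body of an HTTP POST, and xmlrpc.server dispatches the parsed method name to a registered function.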

Simple Object Access Protocol (SOAP)

SOAP was developed in the late 1990s as a network protocol using XML as its message format and HyperText Transfer Protocol (HTTP) as its transport. SOAP generated a loyal following of Web services programmers and became widely used for enterprise applications.
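As a rough illustration of SOAP's message format, this sketch builds a minimal SOAP 1.1 envelope with Python's standard xml.etree.ElementTree; the GetPrice operation and Item element are invented for the example:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

# A SOAP message is an Envelope containing a Body, which in turn
# holds the operation being invoked and its parameters.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
call = ET.SubElement(body, "GetPrice")
ET.SubElement(call, "Item").text = "apples"

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

The resulting XML is what gets POSTed over HTTP; in practice a WSDL document describes the available operations so that client stubs can be generated automatically.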

Representational State Transfer (REST)

REST is another, more recent programming model that also supports Web services. Like SOAP, REST APIs use HTTP, but instead of XML, REST applications often use JavaScript Object Notation (JSON). REST and SOAP differ greatly in their approaches to state management and security, both key considerations for network programmers. Mobile apps may or may not utilize network APIs, but the ones that do often use REST.
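The JSON side of a REST exchange can be sketched with Python's standard json module; the payload below is invented for illustration:

```python
import json

# A REST client typically sends and receives JSON bodies over HTTP.
# Serialization turns native data structures into the wire format;
# the server reconstructs the same structure on the other end.
payload = {"user": "alice", "active": True, "roles": ["admin", "dev"]}

body = json.dumps(payload)    # what goes on the wire
decoded = json.loads(body)    # what the receiver reconstructs

print(body)
```

Compared with a SOAP envelope, there is no fixed outer structure: the HTTP method (GET, POST, PUT, DELETE) and the URL carry the meaning of the request, and the JSON body carries only the data.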

The Future of APIs

Both SOAP and REST continue to be actively used for development of new Web services. Being a much newer technology than SOAP, REST is more likely to evolve and produce other offshoots of API development.

Operating systems have also evolved to support the many new Network API technologies. In modern operating systems like Windows 10, for example, sockets continue to be a core API, with HTTP and other additional support layered on top for RESTful style network programming.

As is often the case in computer fields, newer technologies tend to roll out much faster than old ones become obsolete. Look for interesting new API developments to happen especially in the areas of cloud computing and the Internet of Things (IoT), where the characteristics of devices and their usage models are quite different from traditional network programming environments.

So that's it. Hope you guys like it. If yes, then please comment down below and do not forget to like, follow and share our social media platforms. 

How to run Windows Applications on Linux using Wine..? || tech talks group ||




The goal of the Wine project is to develop a "translation layer" for Linux and other POSIX compatible operating systems that enables users to run native Microsoft Windows applications on those operating systems.

This translation layer is a software package that "emulates" the Microsoft Windows API (Application Programming Interface), but the developers emphasize that it is not an emulator in the sense that it adds an extra software layer on top of the native operating system, which would add memory and computation overhead and negatively affect performance.

Instead, Wine provides alternative DLLs (Dynamic Link Libraries) that are needed to run the applications. These are native software components that, depending on their implementation, can be just as efficient as or more efficient than their Windows counterparts. That is why some MS Windows applications run faster on Linux than on Windows.

The Wine development team has made significant progress towards achieving the goal to enable users to run Windows programs on Linux. One way to measure that progress is to count the number of programs that have been tested. The Wine Application Database currently contains more than 8500 entries. Not all of them work perfectly, but most commonly used Windows Applications run quite well, such as the following software packages and games: Microsoft Office 97, 2000, 2003, and XP, Microsoft Outlook, Microsoft Internet Explorer, Microsoft Project, Microsoft Visio, Adobe Photoshop, Quicken, Quicktime, iTunes, Windows Media Player 6.4, Lotus Notes 5.0 and 6.5.1, Silkroad Online 1.x, Half-Life 2 Retail, Half-Life Counter-Strike 1.6, and Battlefield 1942 1.6.

After installing Wine, Windows applications can be installed by placing the CD in the CD drive, opening a shell window, navigating to the CD directory containing the installation executable, and entering "wine setup.exe", if setup.exe is the installation program.

When executing programs in Wine, the user can choose between the "desktop-in-a-box" mode and mixable windows. Wine supports both DirectX and OpenGL games, although support for Direct3D is limited. There is also a Wine API that allows programmers to write software that is source and binary compatible with Win32 code.

The project was started in 1993 with the objective of running Windows 3.1 programs on Linux. Subsequently, versions for other Unix operating systems have been developed. The original coordinator of the project, Bob Amstadt, handed the project over to Alexandre Julliard a year later. Alexandre has been leading the development efforts ever since.

So that's it. Hope you guys like it. If yes, then please comment down below and do not forget to like, follow and share our social media platforms.