What's up, everybody? This is TechG, back with another video to help you successfully pass the CompTIA Tech+ FC0-U71 certification exam. So let's get into it. In this video, we're going to be talking about units of measure in computing. For those studying for the CompTIA Tech+ exam, understanding these common units is crucial, and in this video we're going to break down the most widely used units of measure in computing, explain what they represent, and then compare them to one another.

All right, so let's start with the most fundamental units in computing, which are the bit and the byte. A bit is short for binary digit, and it's the smallest unit of data in computing. It can have a value of either zero or one, so think of it as the basic building block for all digital data. A byte is a collection of eight bits, and one byte is typically used to represent a single character, like the letter A or a punctuation mark. In fact, most of the data we interact with on computers is measured in bytes rather than bits. For example, a text file containing a short message might be around 100 bytes in size. And it is important to remember that in some contexts data transfer speeds are measured in bits per second, while data storage is generally measured in bytes.

All right, now let's move on to the next levels of units: the kilobyte, megabyte, gigabyte, terabyte, and petabyte. These units build upon each other and are used to measure larger quantities of data. The first one is the kilobyte, and this is 1,024 bytes, although in many cases it is rounded down to just 1,000 bytes for simplicity. To put this into context, a small text document might be around 10 kilobytes in size. Then we have the megabyte, and this is 1,024 kilobytes, or roughly 1 million bytes; a standard MP3 song file might range from 3 to 5 megabytes. Then we have the gigabyte, and this is 1,024 megabytes, or around 1 billion bytes. Most smartphones today have storage capacities ranging from 64 gigabytes to
512 gigabytes, and a standard hour-long HD video could be approximately 1 to 2 gigabytes in size. Next we have the terabyte, and a terabyte is 1,024 gigabytes, or roughly 1 trillion bytes. Modern hard drives often have capacities starting at 1 terabyte or higher, and to give you a sense of size, a 1-terabyte drive can hold around 250,000 high-quality photos or 250 full-length HD movies. Then we have the petabyte, and this is 1,024 terabytes, or around 1 quadrillion bytes. To put this in perspective, one petabyte is enough to store around 500 billion pages of standard printed text. Large organizations like data centers and cloud storage providers often deal with data at the petabyte level. For example, social media platforms, search engines, or video streaming services can have data storage requirements in petabytes, given the immense amount of user-generated content that they store. Now, as you move up these units from kilobyte to petabyte, the scale of data being measured increases significantly, and understanding these units is crucial when discussing data storage capacity and requirements.

All right, now let's discuss data transfer units, which are essential for measuring the speed at which data is moved or transmitted. The first one we have is bits per second. As mentioned earlier, bits are the smallest unit of data, and when we talk about transfer speeds we often use bits per second. For example, an internet connection might be described as transferring data at 10 bits per second, which is extremely slow by today's standards. Then we have kilobits per second. A kilobit per second is 1,000 bits per second, and historically, dial-up internet connections were measured in kilobits per second, like 56 kilobits per second. Then we have megabits per second. A megabit per second is 1,000 kilobits per second, or 1 million bits per second, and most broadband internet speeds are measured in megabits per second today, like 100 megabits or 200 megabits per second. Now, a common
misconception is to confuse megabits per second with megabytes per second. Remember, one byte equals 8 bits, so if your internet is 100 megabits per second, your download speed is roughly 12.5 megabytes per second. Next we have gigabits per second. A gigabit per second is 1,000 megabits per second, or 1 billion bits per second. Gigabit-per-second speeds are often associated with fiber-optic internet connections and are standard for high-speed networks such as data center connections or enterprise-level networking. For example, a network speed of 1 gigabit per second is capable of transferring around 125 megabytes per second. Then we have terabytes per second. A terabyte per second is equivalent to 8 terabits per second, or 1,000 gigabytes per second. Terabyte-per-second speeds are extremely high and typically found in supercomputing environments, data center interconnects, or specialized network backbones. This speed is used for transferring extremely large volumes of data almost instantaneously, often in scenarios involving big data processing, scientific research, or high-frequency trading systems. Now, understanding these units of data transfer is crucial, especially when comparing the speed and efficiency of internet connections, network transfers, or data communication across different platforms. And remember that higher units like gigabits per second and terabytes per second signify extremely fast data transfer, critical for high-performance computing and large-scale data movement.

All right, the next unit to cover is the hertz. Hertz is used to measure frequency, which in computing often relates to clock speed, or the speed of a processor. First we have the standard hertz: one hertz equals 1 cycle per second. In computing, this often refers to the number of cycles a CPU can execute per second. For example, a CPU running at one hertz would be performing one operation per second, which is extremely slow. Then we have what are called the kilohertz, megahertz,
and gigahertz. A kilohertz is 1,000 hertz, a megahertz is 1 million hertz, and a gigahertz is 1 billion hertz. Modern processors are typically measured in gigahertz, such as 2.5 gigahertz or 3.6 gigahertz, indicating that they can perform billions of operations per second. Now, the faster the clock speed, measured in hertz, the more operations a processor can perform per second, which generally leads to faster performance.

For traditional hard disk drives, an important unit of measure is RPM, which stands for revolutions per minute. This unit measures how fast the platters inside a hard drive spin, and common RPM values for hard disk drives are 5,400 RPM and 7,200 RPM, with higher speeds offering faster read/write performance. Now, note that solid-state drives do not have RPMs because they have no moving parts, plus they're much faster than traditional hard disk drives. Understanding RPM is important when comparing the performance of hard drives, as it can significantly impact data read and write speeds.

All right, finally, let's cover watts, which measure power consumption in computing devices. Watts measure the rate of energy transfer and are used to describe how much power a device consumes. Devices like laptops might use 30 to 90 watts, while gaming desktops could use over 500 watts, especially if they have powerful CPUs and GPUs. Knowing the wattage is critical for power supply selection in computer systems and for understanding energy efficiency, and understanding watts helps in making informed choices about power requirements, especially when building or upgrading a computer.

So, in conclusion, understanding units of measure is foundational for anyone studying for the CompTIA Tech+ certification exam. Here's a quick recap: bit and byte are the building blocks of data; larger units like kilobytes, megabytes, gigabytes, and terabytes help measure data storage; data transfer rates are measured in bits per second, kilobits per second, and megabits per second,
which is crucial for understanding communication speeds; hertz is used to measure frequency and processor speed, such as gigahertz for CPUs; RPM measures the speed of traditional hard drives, which impacts their performance; and watts indicate the power consumption of computing devices. By mastering these units, you'll have a better understanding of how data is stored, transferred, processed, and powered.

All right, now it's time for our favorite part of the video: the check on learning. The first question is: which of the following units is most commonly used to measure the speed of a computer processor? Would it be A) megabytes, B) hertz, C) gigabytes, or D) watts? And the correct answer would be B) hertz. The speed of a computer processor is measured in hertz, typically in gigahertz, which indicates the number of cycles a processor can complete per second. Megabytes and gigabytes are units of data storage, while watts measure power consumption.

Next question: if a storage device has a capacity of 1 terabyte, how many gigabytes is it equivalent to? Would it be A) 100 gigabytes, B) 512 gigabytes, C) 1,024 gigabytes, or D) 2,048 gigabytes? The correct answer is C) 1,024 gigabytes. One terabyte is equivalent to 1,024 gigabytes. The relationship between these units is based on the binary system used in computing, where 1 terabyte equals 1,024 gigabytes, 1 gigabyte equals 1,024 megabytes, and so on.

All right, final question: which unit of measurement is used to describe the speed of data transfer in a network? Would it be A) revolutions per minute, B) megabits per second, C) gigahertz, or D) megabytes per second? And the correct answer is B) megabits per second. Network speeds are often measured in megabits per second or gigabits per second, representing the rate of data transfer. Megabytes per second is also a data transfer rate, but it is less commonly used for network speeds. RPM is related to rotational speeds of devices like hard drives, and gigahertz measures processor speed.
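The conversions walked through in this video (binary storage prefixes, 1 TB = 1,024 GB, and megabits versus megabytes per second) can be sketched in a few lines of Python. This is just an illustration of the arithmetic, not something you'll need to write on the exam:

```python
# Worked unit conversions from the video.

BITS_PER_BYTE = 8    # 1 byte = 8 bits
BINARY_STEP = 1024   # each storage unit is 1,024 of the one below it

# Storage units, built up from bytes.
kilobyte = BINARY_STEP             # 1,024 bytes
megabyte = kilobyte * BINARY_STEP  # ~1 million bytes
gigabyte = megabyte * BINARY_STEP  # ~1 billion bytes
terabyte = gigabyte * BINARY_STEP  # ~1 trillion bytes
petabyte = terabyte * BINARY_STEP  # ~1 quadrillion bytes

print(terabyte // gigabyte)  # 1 TB = 1,024 GB -> prints 1024

# Transfer speed: megabits per second vs. megabytes per second.
def mbps_to_megabytes_per_sec(mbps):
    """Divide by 8, because 1 byte = 8 bits."""
    return mbps / BITS_PER_BYTE

print(mbps_to_megabytes_per_sec(100))   # 100 Mbps  -> 12.5 MB/s
print(mbps_to_megabytes_per_sec(1000))  # 1 Gbps    -> 125.0 MB/s
```

Running through it once by hand is a good way to lock in the two facts the exam questions above lean on: storage units scale by 1,024, and dividing a bits-per-second figure by 8 gives bytes per second.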