Minitar MWGUHA Driver Download for Windows Vista

Minitar MWGUHA

Features: high-gain 5 dBi detachable antenna; WMM (Wi-Fi Multimedia) QoS support; 64/128/256-bit WEP and WPA/WPA2 data encryption; support for Windows 98SE, Me, 2000, XP, XP x64, Mac, and Linux; suitable for notebook or desktop PCs; USB 2.0/1.1 interface; Software AP (bridge) function; NDS and PSP X-Link support; and roaming support.






Specifications


Click here to download the Minitar MWGUHA driver for Windows Vista (also covering Windows 98, ME, and 2000), Linux, and Mac.

TP-Link TL-WN322G driver download for Windows Vista/XP/2K/ME/98

The TP-LINK TL-WN322G Wireless USB Adapter gives you the flexibility to install your desktop or notebook PC in the most convenient location available, without the cost of running network cables.

Its auto-sensing capability allows high packet transfer up to 54Mbps for maximum throughput, or dynamic range shifting to lower speeds due to distance or operating limitations in an environment with a lot of electromagnetic interference. It can also interoperate with all the 11Mbps wireless (802.11b) products. Your wireless communications are protected by up to 256-bit encryption, so your data stays secure.

Additionally, the TL-WN322G supports Soft AP mode, which allows PSP connections for an enjoyable online-gaming experience.

The TP-Link TL-WN322G is a wireless USB adapter for desktop or notebook PCs. It supports both 802.11b and 802.11g, wireless LAN data rates of 54/48/36/24/18/12/9/6 Mbps and 11/5.5/2/1 Mbps, Ad-hoc and Infrastructure modes, and roaming between access points.

Click here to download the TP-Link TL-WN322G driver for Windows Vista / XP / 2000 / Me / 98.


Wireless Basic: Network Architecture

The architecture of a network defines the protocols and components necessary to satisfy application requirements. One popular standard for illustrating the architecture is the seven-layer Open System Interconnect (OSI) Reference Model, developed by the International Organization for Standardization (ISO). OSI specifies a complete set of network functions, grouped into layers, which reside within each network component. The OSI Reference Model is also a handy model for representing the various standards and interoperability of a wireless network.





The OSI layers provide the following network functionality:
  • Layer 7—Application layer: Establishes communications among users and provides basic communications services such as file transfer and e-mail. Examples of software that runs at this layer include Simple Mail Transfer Protocol (SMTP), HyperText Transfer Protocol (HTTP) and File Transfer Protocol (FTP).
  • Layer 6—Presentation layer: Negotiates data transfer syntax for the application layer and performs translations between different data formats, if necessary. For example, this layer can translate the coding that represents the data when communicating with a remote system made by a different vendor.
  • Layer 5—Session layer: Establishes, manages, and terminates sessions between applications. Wireless middleware and access controllers provide this form of connectivity over wireless networks. If the wireless network encounters interference, the session layer functions will suspend communications until the interference goes away.
  • Layer 4—Transport layer: Provides mechanisms for the establishment, maintenance, and orderly termination of virtual circuits, while shielding the higher layers from the network implementation details. In general, these circuits are connections made between network applications from one end of the communications circuit to another (such as between the web browser on a laptop to a web page on a server). Protocols such as Transmission Control Protocol (TCP) operate at this layer.
  • Layer 3—Network layer: Provides the routing of packets through a network from source to destination. This routing ensures that data packets are sent in a direction that leads to a particular destination. Protocols such as Internet Protocol (IP) operate at this layer.
  • Layer 2—Data link layer: Ensures medium access, as well as synchronization and error control between two entities. With wireless networks, this often involves coordination of access to the common air medium and recovery from errors that might occur in the data as it propagates from source to destination. Most wireless network types have a common method of performing data link layer functions independent of the actual means of transmission.
  • Layer 1—Physical layer: Provides the actual transmission of information through the medium. Physical layers include radio waves and infrared light.

The combined layers of a network architecture define the functionality of a wireless network, but wireless networks directly implement only the lower layers of the model. A wireless NIC, for example, implements the data link layer and physical layer functions. Other elements of the network (such as wireless middleware), however, offer functions that implement the session layer. In some cases, the addition of a wireless network might impact only the lower layers, but attention to higher layers is necessary to ensure that applications operate effectively in the presence of wireless network impairments.

Each layer of the OSI model supports the layers above it. In fact, the lower layers often appear transparent to the layers above. For example, TCP operating at the transport layer establishes connections with applications at a distant host computer, without awareness that lower layers are taking care of synchronization and signaling.

As shown in the figure above, protocols at each layer communicate across the network to the respective peer layer. The actual transmission of data, however, occurs at the physical layer. As a result, the architecture allows for a layering process in which a particular layer embeds its protocol information into frames that are placed within the frames of lower layers. The frame that is sent by the physical layer actually contains the frames of all higher layers. At the destination, each layer passes applicable frames up to higher layers to facilitate the protocol between peer layers.
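The layering process described above can be sketched in a few lines of Python. The header strings here are purely illustrative stand-ins for real protocol headers, not an actual protocol implementation:

```python
# Toy illustration of OSI-style encapsulation: each layer wraps the
# payload it receives from the layer above with its own header, and the
# destination strips headers in reverse order. Header names are
# illustrative only.

def encapsulate(payload: bytes) -> bytes:
    """Wrap application data in transport, network, and data link headers."""
    segment = b"TCP|" + payload   # Layer 4: transport header
    packet = b"IP|" + segment     # Layer 3: network header
    frame = b"ETH|" + packet      # Layer 2: data link header
    return frame                  # Layer 1 transmits this as a bit stream

def decapsulate(frame: bytes) -> bytes:
    """At the destination, each layer strips its peer layer's header."""
    assert frame.startswith(b"ETH|")
    packet = frame[len(b"ETH|"):]
    assert packet.startswith(b"IP|")
    segment = packet[len(b"IP|"):]
    assert segment.startswith(b"TCP|")
    return segment[len(b"TCP|"):]

data = b"GET /index.html"
print(decapsulate(encapsulate(data)) == data)  # True
```

The frame that reaches the physical layer (`ETH|IP|TCP|...`) really does contain the frames of every higher layer, which is exactly the nesting the text describes.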




*) Jim Geier

About Cache

Cache is a special area of memory, managed by a cache controller, that improves performance by storing the contents of frequently accessed memory locations and their addresses. A memory cache and a disk cache are not the same. A memory cache is implemented in hardware and speeds up access to memory. A disk cache is software that improves hard-disk performance. When the processor references a memory address, the cache checks to see if it holds that address. If it does, the information is passed directly to the processor, so RAM access is not necessary. A cache can speed up operations in a computer whose RAM access is slow compared with its processor speed, because cache memory is always faster than normal RAM.

There are several types of caches:
Direct-mapped cache: A location in the cache corresponds to several specific locations in memory, so when the processor calls for certain data, the cache can locate it quickly. However, since several blocks in RAM correspond to that same location in the cache, the cache may spend its time refreshing itself and calling main memory.
Fully associative cache: Information from RAM may be placed in any free blocks in the cache so that the most recently accessed data is usually present; however, the search to find that information may be slow because the cache has to index the data in order to find it.
Set-associative cache: Information from RAM is kept in sets, and these sets may have multiple locations, each holding a block of data; each block may be in any of the sets, but it will only be in one location within that set. Search time is shortened, and frequently used data are less likely to be overwritten. A set-associative cache may use two, four, or eight sets.
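The direct-mapped case, and the thrashing it can cause, is easy to demonstrate with a toy simulation. The line count and block size below are arbitrary illustrative values:

```python
# Toy direct-mapped cache: each memory block maps to exactly one cache
# line, chosen by (block number mod number of lines). Two blocks that
# share a line evict each other, producing the constant refreshing
# described in the text.

class DirectMappedCache:
    def __init__(self, num_lines: int, block_size: int = 16):
        self.num_lines = num_lines
        self.block_size = block_size
        self.lines = {}   # line index -> block number currently stored
        self.hits = 0
        self.misses = 0

    def access(self, address: int) -> bool:
        block = address // self.block_size
        index = block % self.num_lines
        if self.lines.get(index) == block:
            self.hits += 1              # hit: served from the cache
            return True
        self.lines[index] = block       # miss: fetch block, evict old one
        self.misses += 1
        return False

cache = DirectMappedCache(num_lines=4)
# Addresses 0 and 64 are in blocks 0 and 4, and 4 % 4 == 0, so both map
# to line 0; alternating between them evicts on every access.
for _ in range(4):
    cache.access(0)
    cache.access(64)
print(cache.hits, cache.misses)  # 0 8
```

A fully associative or set-associative cache would keep both blocks resident and turn most of those misses into hits.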




Disk Cache

An area of computer memory where data is temporarily stored on its way to or from a disk. When an application asks for information from the hard disk, the cache program first checks to see if that data is already in the cache memory. If it is, the disk cache program loads the information from the cache memory rather than from the hard disk. If the information is not in memory, the cache program reads the data from the disk, copies it into the cache memory for future reference, and then passes the data to the requesting application. This process is shown in the accompanying illustration. A disk cache program can significantly speed most disk operations. Some network operating systems also cache other often accessed and important information, such as directories and the file allocation table (FAT).
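The check-cache-first logic described above can be sketched as follows; the dictionary standing in for the hard disk and the path used are hypothetical:

```python
# Sketch of read-through disk caching: check the in-memory cache first,
# and only read the (simulated) disk on a miss, copying the result into
# the cache for future requests.

disk = {"/etc/hosts": b"127.0.0.1 localhost"}  # stands in for the hard disk
cache: dict[str, bytes] = {}
disk_reads = 0

def read(path: str) -> bytes:
    global disk_reads
    if path in cache:          # cache hit: no slow disk access needed
        return cache[path]
    disk_reads += 1            # cache miss: go to the slow device
    data = disk[path]
    cache[path] = data         # keep a copy for future reference
    return data

read("/etc/hosts")
read("/etc/hosts")
print(disk_reads)  # 1 -- the second request was served from memory
```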


Write-back Cache
A technique used in cache design for writing information back into main memory. In a write-back cache, the cache stores the changed block of data, but only updates main memory under certain conditions, such as when the whole block must be overwritten because a newer block must be loaded into the cache or when the controlling algorithm determines that too much time has elapsed since the last update. This method is rather complex to implement, but is much faster than other designs.


Write-through Cache
A technique used in cache design for writing information back into main memory. In a write-through cache, each time the processor returns a changed bit of data to the cache, the cache updates that information in both the cache and in main memory. This method is simple to implement, but is not as fast as other designs; delays can be introduced when the processor must wait to complete write operations to slower main memory.
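The difference between the two write policies comes down to when main memory is updated, which a minimal sketch makes concrete (the class and counters here are illustrative, not real cache hardware):

```python
# Minimal sketch contrasting write-through and write-back. Write-through
# updates main memory on every store; write-back marks the line "dirty"
# and defers the memory update until eviction or an explicit flush.

class Cache:
    def __init__(self, write_back: bool):
        self.write_back = write_back
        self.cache = {}
        self.memory = {}
        self.dirty = set()
        self.memory_writes = 0

    def store(self, addr: int, value: int):
        self.cache[addr] = value
        if self.write_back:
            self.dirty.add(addr)        # defer the slow memory update
        else:
            self.memory[addr] = value   # write-through: update both now
            self.memory_writes += 1

    def flush(self):
        for addr in self.dirty:         # write-back pays its cost here
            self.memory[addr] = self.cache[addr]
            self.memory_writes += 1
        self.dirty.clear()

wt, wb = Cache(write_back=False), Cache(write_back=True)
for v in range(10):                     # ten stores to the same address
    wt.store(0x100, v)
    wb.store(0x100, v)
wb.flush()
print(wt.memory_writes, wb.memory_writes)  # 10 1
```

Ten repeated stores cost ten memory writes under write-through but only one under write-back, which is why write-back is faster and why it needs the more complex bookkeeping the text mentions.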



Why is the Cache Important?



The main reason cache is important is that it increases the real speed of a processor by providing the processor with data more quickly. A processor can only crank through data if it is being given data, and any delay that exists between when the processor requests data and when the processor receives it means that clock cycles are being left idle which could have otherwise been used.

That said, summing up the importance of cache in a general way is difficult. Cache performance was traditionally ignored because, until fairly recently, it did not have a major impact. The cache for most Pentium 4 series processors, for example, was only 256 or 512 kilobytes, depending on the model. Having some cache was useful, but it did not have a major impact on application performance because there was not enough cache for applications to make use of. For example, the original Celeron had no cache integrated into the processor, yet it still performed extremely well in games compared with processors that did include cache. This was partly because cache sizes were not large enough to hold more than extremely common and general sets of data, and partly because games generally do not make much use of CPU cache: the amount of frequently used data far exceeds what the cache can hold.

However, cache has become more important in modern processors thanks to better implementations and an increase in cache size. This increase in the importance of cache is probably due to the problems that became evident in Intel's Pentium 4 line-up. The Pentium 4 processors relied heavily on high clock speeds, and as a result they were power-hungry and ran hot. Intel learned from this when it created the Core 2 Duo series, which used lower clock speeds but made up for it with multiple cores and a much larger cache. These days, it is not unusual to find a processor with an 8 MB L2 cache. Cache has become important because it gives Intel and AMD a way to increase the performance of their processors, and it goes hand-in-hand with the trend towards processors with multiple cores. Both the Phenom II and the Core i7 have larger caches than the Phenom and Core 2 architectures, and the trend of increasing cache size is likely to continue.


Download Acer 4535 Driver for Vista

The Aspire 4535 provides entertainment enjoyment with high-def visuals and pleasurable surround sound. True cinematic 16:9 aspect ratio and 1366 x 768 pixel resolution generate widescreen, HD enjoyment of today's best entertainment and 8 ms high-def response time delivers high-quality moving images.








BENEFITS

Cinematic fun
True cinematic 16:9 aspect ratio and 1366 x 768 pixel resolution generate widescreen, HD enjoyment of today's best entertainment.
Powerful performance
Advanced features deliver increased security, speedy multitasking and efficient performance.
Superior control
Revolutionary new features give you unprecedented control over your entertainment, work and life!



DRIVERS


Click here to download Realtek Audio Driver
Click here to download BTW Broadcom
Click here to download Bison Camera
Click here to download Suyin camera
Click here to download Realtek Cardreader Driver
Click here to download Nvidia Chipset Driver
Click here to download Fingerprint AS Authentec
Click here to download Broadcom Gigabit LAN Driver
Click here to download Launch Manager for Vista
Click here to download Foxconn Modem Driver
Click here to download Touchpad Synaptics Driver
Click here to download Nvidia VGA Driver
Click here to download Atheros BGN-77H053 Driver
Click here to download Foxconn WLAN T60H976 BG Driver
Click here to download Foxconn WLAN T77H030 BG Driver
Click here to download QMI WLAN Driver


Download Acer Ferrari 1100 Drivers

The Ferrari 1100 takes ultra-portability into the fast lane. Driven by the revolutionary performance of dual-core processing power and wrapped in an exclusive, ultra-lightweight case, the Ferrari 1100 combines the unique style of race-bred innovation with the leading edge of mobile technology to bring your digital media to life wherever you are.







BENEFITS

Designed for ultra-performance
Beneath the carbon fibre casing of the Ferrari 1100 lies the latest AMD Turion 64 X2 dual-core mobile technology, which not only sets the Ferrari 1100 apart from the competition for mobile performance and long-lasting battery life but also adds style, strength and lightweight flexibility to your mobile world. For an even greater mobile multimedia experience, the Ferrari 1100 also features an integrated slot-loading DVD SuperMulti DL drive and Dolby-certified technology for cinema-style surround sound.

Faster, sleeker, lighter, safer
The Ferrari 1100 takes mobile communication to entirely new levels, combining full wired and wireless Draft-n connectivity options with an Acer Crystal Eye video camera concealed at the top of the 12.1” WXGA display for real-time video conferencing on the move. With Acer DASP technology providing system shock protection along 3-axes and Acer Bio-Protection fingerprint solution for enhanced network and data security, the Ferrari 1100 is a faster, sleeker, lighter and safer way to take your digital world with you.

DRIVERS
Download Ferrari 1100 Drivers for Windows XP:
Click here to download ATI SM Bus
Click here to download ATI Radeon X1250 Mobility Driver
Click here to download AMD Processor Driver
Click here to download Realtek Audio Driver
Click here to download Atheros Wireless Driver
Click here to download Broadcom Wireless Driver
Click here to download Broadcom Gigabit LAN Driver
Click here to download Agere Modem Driver
Click here to download Conexant Modem Driver
Click here to download Bison Camera Driver
Click here to download Suyin Camera Driver
Click here to download Broadcom Bluetooth Driver
Click here to download Acer BioProtect Fingerprint Driver
Click here to download Realtek Cardreader Driver
Click here to download Launch Manager
Click here to download Touchpad Synaptic Driver

Download Ferrari 1100 Drivers for Windows Vista:
Click here to download ATI VGA Filter Driver
Click here to download ATI Radeon X1250 Mobility Driver
Click here to download Realtek Audio Driver
Click here to download Atheros Wireless Driver
Click here to download Broadcom Wireless Driver
Click here to download Broadcom Gigabit LAN Driver
Click here to download Agere Modem Driver
Click here to download Conexant Modem Driver
Click here to download Bison Camera Driver
Click here to download Suyin Camera Driver
Click here to download Broadcom Bluetooth Driver
Click here to download Finger Authentic Driver
Click here to download Realtek Cardreader Driver
Click here to download Launch Manager
Click here to download Touchpad Synaptic Driver
Click here to download AcerVCM


The New Microsoft Office 2010

Microsoft Office 2010 (codename Office 14) is the next version of the Microsoft Office productivity suite for Microsoft Windows. It entered development during 2006 while Microsoft was finishing work on Microsoft Office 12, which was released as the 2007 Microsoft Office System.
It was previously thought that Office 2010 would ship in the first half of 2009, but Steve Ballmer has officially announced that Office 2010 will ship in 2010. The new version will implement the ISO compliant version of Office Open XML which was standardized as ISO 29500 in March 2008.

The software giant plans to offer a Web-based version of its Office productivity suite, known as Office Web, which will debut with the release of Office 2010. Office Web will include online versions of Word, Excel, PowerPoint and OneNote. The next versions of Microsoft Office Visio, OneNote, Microsoft Office Project, and Publisher will feature the ribbon interface element used in other Office 2007 applications.

Builds
Earlier this year, on April 15, 2009, Microsoft confirmed that Office 2010 will be officially released in the first quarter of 2010. At an event on May 12, 2009, it also announced that Office 2010 will begin technical testing during July. It will also be the first version of Office to ship in both 32-bit and 64-bit versions.

New features
The technical preview of Office 2010 is not yet available (testing starts in July), but when Microsoft used it to show improved integration with Windows 7 at its event this week, various experts spotted some of the new features of the new Office.


  • The ribbon interface was introduced in Office 2007 and is here to stay, but in Office 2010 it gets the flatter Windows 7 look. The ribbon is a form of toolbar combined with a tab bar.
  • Instead of the round Office icon, each application now has a button that looks like the buttons for the ribbon tabs, in that application's signature colour (green for Excel, blue for Word, red for PowerPoint, yellow for Outlook, and so on).
  • The window menus, which were removed in Office 2007, are back; they let you restore, move, size, minimise, maximise, or close the app window.
  • With a right-click on the Outlook icon on the taskbar, users can now create a new email message, appointment, contact, or task, or open the inbox, calendar, contacts, or tasks window.
  • You can drag a file from any application's jump list onto the navigation pane in Outlook to create a new mail message with the file as an attachment. Outlook can also automatically ignore all replies to messages you are not interested in.
  • The company says users can create their own Quick Steps, which sound like a combination of the Quick Access Toolbar and macros: common tasks such as filing a message, creating a meeting, or marking a flagged message and deleting it in one step. Users can also create Quick Steps for things they do over and over again in Outlook.
  • In the new Outlook, voice mail can be converted to text and sent as email. Outlook can also sync text messages from a Windows Mobile 6.1 phone into the user's inbox and reply to them.
  • When you reply-all to a message you were BCC'd on, a warning is displayed. Microsoft claims that starting, searching, and shutting down Outlook are all much faster, and promises that Outlook will run and search faster than the 2007 release.
  • There is no news yet on the next version of Office for the Mac, but the next version of Outlook Web Access will run on Safari and Firefox with all features, rather than in a Lite version.

What is Search Engine Optimization

When you do business online, one thing you must be very aware of is the importance of search engines like Google and Yahoo. Considering that a very high proportion of purchases are made directly after the use of a search engine, you will soon see why getting the attention of the search engines, and ranking highly in their results, is necessary.





Search engine optimization services vary in what they do and how they do it, but their main goal is to make sure that your website gets the attention it needs. Many different companies provide this valuable service, from single freelancers to full agencies; the important thing is to find someone you feel comfortable with. Don't be afraid to ask who they have worked with in the past and which industries they are most familiar with.

A search engine optimization company is one that will put a lot of time into researching your company and the people who are looking for it. It will figure out how your prospective customers are trying to find you, and think about ways to make it easier for them to do so. One tool such a company might use is keyword density: it will work with the text on your site, looking at which words you use and how relevant they are, in order to make your site more searchable. It will also work on things like link building, which brings more people to your site from related areas as well as from the search engines themselves.
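Keyword density itself is just the share of words on a page that match a target keyword. The sketch below shows the basic calculation; real SEO tools are more sophisticated (stemming, phrases, stop words), and the sample page text is made up for illustration:

```python
# Hedged sketch of the keyword-density metric: the percentage of words
# on a page that match a target keyword. Tokenization here is a simple
# regex; production tools handle stemming, phrases and stop words.

import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    matches = words.count(keyword.lower())
    return 100.0 * matches / len(words)

page = "Cheap flights to Paris. Compare cheap flights and book your flights today."
print(round(keyword_density(page, "flights"), 1))  # 25.0
```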

When you are looking for search engine optimization services, remember to use one that employs ethical means. For instance, one of the fastest ways to get traffic to your site is to deliberately post negative reviews and statements about other people or things. While this brings more traffic, it does not last, and the people who come are not interested in your goods; at the end of the day, it is a very unethical way to bring people to your site. Take some time to find out what techniques any search engine optimization service you are considering will use.

Many people want their website to be more noticeable, but they are often unsure about how to go about it. There are, however, plenty of search engine optimization services out there that are willing to offer you their services, so ensure that you think about what is available, and how to make your website really shine!


Using Google for Search Engine Optimization


Google search engine optimization is a very complicated thing. Interestingly enough, Google gives you all the tools you need. Search engine optimization (SEO) is important as most of the traffic to an article or website will come from the search engines. The following are the steps you can take in order to make use of the tools provided by Google for Search Engine Optimization.

  1. AdWords is one of the most useful tools on the web for Google search engine optimization. AdWords lets you pull the estimated average cost per click for a specific keyword, in addition to providing traffic information for that keyword. This means you can find out how much an advertiser has to pay for an ad click and how often the keyword is searched for, so that you can target both traffic keywords and your ads at the same time. Keep in mind that you, as the publisher, will receive only a fraction of the estimated cost per click when someone clicks the ad.
  2. Google search itself is also a great tool. If a ton of people are already targeting your keyword, there is a lot of work to be done to get onto the front page. Many authors will abandon a keyword if it is not an open idea; however, it is probably better to write anyway and then optimize your site or article through other methods. Either way, searching gives you an idea of how much additional Google search engine optimization needs to be done.
  3. Google Trends is a tool for gauging the popularity of keywords for your website or articles. It lets you search for and compare keywords so that you can pick the best ones (e.g., "car" vs. "automobile"). This Google search engine optimization tool is part of the Google Labs tool set, which includes many other tools for web developers.
  4. Analytics is a great resource. It lets you identify traffic trends for your website and articles. It is best used by someone with their own website, since you have to put a piece of code on the site to be able to pull data. Traffic trending is important because you can often change keywords to localize ads, making them more pertinent to the locale that is viewing them.


Wireless Network Frequency Fundamentals

Wireless networks can be divided into two broad segments: short-range and long-range. Short-range wireless networks cover a limited area. Examples include local area networks (LANs) such as a company building, school, campus, or home, as well as personal area networks (PANs), in which several portable computers communicate with one another within a single coverage area. These networks typically operate on unlicensed spectrum reserved for industrial, scientific, and medical (ISM) use.





The short-range frequencies available differ from one country to another. The most commonly used frequency is 2.4 GHz, which is available almost worldwide. Other frequencies, such as 5 GHz and 40 GHz, are also often used. The availability of these frequencies allows users to operate wireless networks without having to register for a license or pay fees.

Long-range wireless networks extend beyond the coverage of a LAN. Connectivity is generally provided by companies that sell wireless connectivity as a service. These networks span wide areas such as a city, regency, province, or even an entire country, with the goal of providing global wireless coverage. The most common long-range network is the Wireless Wide Area Network (WWAN). When truly global coverage is required, communication over satellite networks is also possible.

The process of carrying signals over long-range wireless networks is called modulation. There are several modulation techniques, each with advantages and disadvantages in terms of efficiency and power requirements. These techniques include:

1. Narrowband technology
Sends and receives data on a specific radio frequency. The frequency band is kept as narrow as possible while still allowing data to be transmitted. Interference is avoided by coordinating different users onto different frequencies. The receiving radio filters out all signals except those sent on the agreed frequency. This approach usually requires a license from the local government, and is typically used by companies with very wide coverage areas.

2. Spread spectrum
By design, spread spectrum trades bandwidth efficiency for reliability, integrity, and security. It consumes more bandwidth than narrowband technology, but produces a signal that is easier to detect by a receiver that knows what to look for while the signal is being broadcast. There are two variations of spread-spectrum radio: frequency-hopping spread spectrum (FHSS) and direct-sequence spread spectrum (DSSS).
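The frequency-hopping idea is simple to illustrate: transmitter and receiver share a seed, derive the same pseudo-random sequence of channels, and hop together. The sketch below is a toy model; the channel count of 79 is borrowed loosely from classic 2.4 GHz FHSS systems, and nothing here is a real radio implementation:

```python
# Toy illustration of FHSS hop-sequence generation: both ends seed a
# PRNG identically, so they derive the same channel sequence and stay
# synchronized, while a listener without the seed cannot predict hops.

import random

def hop_sequence(seed: int, channels: int = 79, hops: int = 8) -> list[int]:
    rng = random.Random(seed)   # deterministic for a shared seed
    return [rng.randrange(channels) for _ in range(hops)]

tx = hop_sequence(seed=42)
rx = hop_sequence(seed=42)      # receiver derives the identical sequence
print(tx == rx)  # True
```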

3. Orthogonal Frequency Division Multiplexing (OFDM)
OFDM sends data in parallel, in contrast to the hopping technique used by FHSS and the spreading technique used by DSSS. This protects the data from interference while the signal is transmitted over parallel frequencies. OFDM has ultra-high spectral efficiency, meaning more data can be sent over a smaller amount of bandwidth, which makes it effective for high data-rate transmission. The drawback of OFDM is that it is harder to implement than FHSS or DSSS, and it consumes the most power.

A Brief Look at File Encryption

The Encrypting File System (EFS) lets users encrypt (that is, protect) files and folders, and even all data on a drive formatted with NTFS. It is well suited to securing data considered security-sensitive, especially on portable computers, and is also very useful for protecting data when a computer is shared by several users.




An encrypted file is genuinely confidential, because EFS uses strong encryption based on industry-standard algorithms and public key cryptography.

EFS can set permissions on files and folders within an NTFS partition to control access to them. Note again that EFS only works on the NTFS file system, and is therefore only available on Windows 2000, Windows XP, Windows Vista, and later.

EFS characteristics
  • EFS can be enabled by default, and only authorized users can use it, since it is tied to the user's public and private keys.
  • It requires a Recovery Agent Certificate to work.
  • Encrypted files can be shared with several users.
  • Encryption can be lost when an EFS file is moved to a different file system (for example, a FAT file system).
  • Most importantly, when a file is copied into an encrypted folder, the copy is automatically encrypted as well.
  • File encryption is recorded in the file's attributes.
  • EFS can encrypt and decrypt files on remote computers, whether offline or roaming.
  • Encrypted files can be stored in, or backed up to, web folders.
  • Users cannot encrypt system files or folders.



EFS is fully integrated with NTFS, which replaced the FAT file system used in older versions of Microsoft Windows. Encryption and decryption are transparent to users: when a user saves a file, EFS encrypts the data as it is written to disk, and when the user opens the file, EFS decrypts the data as it is read from disk. A user who does not have the key receives an "Access denied" message.
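The transparency idea can be illustrated with a toy store that encrypts on write and decrypts on read. The XOR "cipher" below is NOT secure and is not what EFS actually uses (EFS combines a per-file symmetric key with public key cryptography); it only shows that the application always sees plaintext while the storage only ever holds ciphertext:

```python
# Toy model of transparent encrypt-on-write / decrypt-on-read. The XOR
# stream cipher here is for illustration only and provides no real
# security; the point is that callers never handle ciphertext.

import itertools

def xor_stream(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

class EncryptedStore:
    """Holds ciphertext internally; callers read and write plaintext."""
    def __init__(self, key: bytes):
        self.key = key
        self.blobs = {}

    def write(self, name: str, plaintext: bytes):
        self.blobs[name] = xor_stream(plaintext, self.key)  # encrypt on write

    def read(self, name: str, key: bytes) -> bytes:
        if key != self.key:
            raise PermissionError("Access denied")  # no key, no plaintext
        return xor_stream(self.blobs[name], key)    # decrypt on read

store = EncryptedStore(key=b"secret")
store.write("report.txt", b"quarterly numbers")
print(store.read("report.txt", b"secret"))  # b'quarterly numbers'
```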

There are several third-party technologies and programs capable of encrypting data, but they are not always fully transparent to users. EFS must be understood, implemented correctly, and managed effectively to ensure that the data you want to protect is genuinely safe. EFS adds real value to your data protection, but, once again, it must be managed and used properly.



Introducing: Windows Azure

Windows Azure, codenamed “Red Dog”, is a cloud services operating system that serves as the development, service hosting, and service management environment for the Azure Services Platform. Launched by Microsoft in 2008, it is currently in Community Technology Preview; commercial availability will likely come at the end of calendar year 2009.


Google and Amazon already have their infrastructure set up for cloud computing, so Microsoft is jumping on the bandwagon as well. The new operating system provides developers with on-demand compute and storage to host, scale, and manage Web applications on the Internet through Microsoft data centers.

Windows Azure is an open platform that will support both Microsoft and non-Microsoft languages and environments. Developers can use their existing Microsoft Visual Studio 2008 expertise to build applications and services on Windows Azure. It supports popular standards and protocols including SOAP, REST, XML, and PHP. Windows Azure also welcomes third party tools and languages such as Eclipse, Ruby, PHP, and Python.

Uses:
  • It can add Web service capabilities to existing packaged applications
  • With minimal on-premises resources, developers can build, modify, and distribute applications to the Web
  • They can create, test, debug, and distribute Web services quickly and inexpensively
  • Reduce costs of building and extending on-premises resources
  • Reduce the effort and costs of IT management
Azure Services Platform
Azure Services Platform is an application platform in the cloud that allows applications to be hosted and run at Microsoft datacenters. It provides an operating system and a set of developer services that can be used individually or together. The platform’s open architecture gives developers the choice to build web applications, applications running on connected devices, PCs, servers, or hybrid solutions offering the best of online and on-premises.


Other Azure Services
Live Services: These Services are a set of building blocks within the Platform for handling user data and application resources. It provides developers with an easy on-ramp to build rich social applications and experiences, across a range of digital devices that can connect with one of the largest audiences on the Web.

Microsoft SQL Services: These services can store and retrieve structured, semi-structured, and unstructured data. It provides Web services that enable relational queries, search, and data synchronization with mobile users, remote offices and business partners.

Microsoft .NET Services: These services include access control to help secure your applications, a service bus for communicating across applications and hosted workflow execution.

Microsoft SharePoint Services & Dynamics CRM Services: With the flexibility to use familiar developer tools like Visual Studio, developers will be able to rapidly build applications that utilize SharePoint and CRM capabilities as developer services for their own applications.

Benefits from the Azure Services Platform
As the platform offers the greatest flexibility, choice, and control in reaching users and customers while using existing skills, it also helps developers easily create applications for the web and connected devices.

Easy way to the cloud: Currently, developers worldwide are using the .NET Framework and the Visual Studio development environment. With the new operating system they can use those same skills to create cloud-enabled applications that can be written, tested, and deployed entirely from Visual Studio. In the near future, developers will be able to deploy applications written on Ruby on Rails and Python as well.

Enables quick results: the platform's applications are very fast, and changes can be made quickly and without downtime, making it an ideal platform for affordably experimenting with and trying out new ideas.

New opportunities: it enables developers to create cloud-based web, mobile, or hybrid applications combined with Live Services, which can reach over 400 million Live users, opening new opportunities to interact with and reach users in new ways.

*) Amarpreet97
Read more...

Network Subnetting: Divide your network

A subnetwork is a logical division of a local area network, created to improve performance and provide security. It describes networked computers and devices that share a common, designated IP address routing prefix. Subnetting breaks a network into smaller, more efficient subnets to prevent excessive rates of Ethernet packet collisions in a large network. To enhance performance, subnets limit the number of nodes that compete for available bandwidth, and such subnets can be arranged hierarchically into a tree-like structure. Routers manage traffic and constitute the borders between subnets. In an IP network, a subnet is identified by a subnet mask, a binary pattern stored in the client machine, server, or router.

The advantages of Subnetting a network are:

  • Through subnetting, a user can reduce network traffic and thereby improve network performance: only traffic that needs to reach another network (subnet) is allowed to pass through the router to the other subnet.
  • It can be used to restrict broadcast traffic on the network.
  • It facilitates simplified management. User can delegate control of subnets to other administrators.
  • Troubleshooting network issues is also simpler when dealing with subnets than it is in one large network.

Implementing Subnetting

When implementing a subnetting scheme, the user should first clarify the requirements: the number of network IDs needed, one for each subnet and for each WAN connection, and the number of host IDs needed, one for each TCP/IP-based network device.

Using the information above, user can create a subnet mask for the network, a subnet ID for every physical network segment and a range of host IDs for every unique subnet.

In the process of subnetting, each bit position taken from the host ID reduces the number of hosts by a factor of 2. For example, a Class B network can have 65,534 possible host addresses or IDs. If you borrow one bit for subnetting, the number of hosts per subnet will be about half that figure, i.e., roughly 65,534 / 2.
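The halving-per-borrowed-bit arithmetic can be checked with Python's standard `ipaddress` module; a minimal sketch (the 192.168.0.0/24 prefix is just an illustrative private range):

```python
import ipaddress

# Borrowing 3 bits from the host portion of a /24 network splits it
# into 2**3 = 8 subnets; each borrowed bit halves the host count.
network = ipaddress.ip_network("192.168.0.0/24")
subnets = list(network.subnets(prefixlen_diff=3))

print(len(subnets))                   # 8 subnets
print(subnets[0])                     # 192.168.0.0/27
print(subnets[0].num_addresses - 2)   # 30 usable host IDs per subnet
```

The `- 2` accounts for the network and broadcast addresses, which cannot be assigned to hosts.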

Network address and logical address

The term network address refers either to a logical address, i.e. a network-layer address such as the IP address, or to the first address (the base address) of a classful address range assigned to an organization. PCs and devices that are part of an internetwork, e.g. the Internet, each have a logical address.

The network address is unique to each device and can either be dynamically or statically configured. An address allows a device to communicate with other devices connected to a network. The most common network addressing scheme is IPv4.

An IPv4 address consists of a 32-bit address written, for human readability, as 4 octets, together with a subnet mask of the same size and notation. To facilitate routing, the address is divided into two parts: the network prefix, which is significant for routing decisions at that particular topological point, and the host part, which identifies a particular device in the network.

The primary reason for subnetting in IPv4 was to improve efficiency in the utilization of the relatively small address space available, particularly to enterprises.

Subnetting in IPv6 networks

Subnetting is also used in IPv6 networks, but in IPv6 the address space available even to end users is so large that address-space restrictions no longer exist.

In IPv6, the recommended allocation for a site is an address space comprising 80 address bits and for residential customer network, it may be as small as 72 bits. This provides 65,536 subnets for a site, and a minimum of 256 subnets for the residential size.

An IPv6 subnet always has a /64 prefix, which provides 64 bits for the host portion of an address. Although it is technically possible to use smaller subnets, they are impractical for LANs because stateless address autoconfiguration of network interfaces (RFC 4862) requires a /64 prefix.
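The subnet counts quoted above follow directly from the prefix lengths; here is a quick check with Python's `ipaddress` module, using the 2001:db8::/32 documentation prefix as a stand-in for real allocations:

```python
import ipaddress

# A /48 site allocation leaves 64 - 48 = 16 bits of subnet ID above
# the fixed /64 host portion, i.e. 2**16 = 65,536 possible subnets.
site = ipaddress.ip_network("2001:db8::/48")
print(2 ** (64 - site.prefixlen))      # 65536

# A residential /56 leaves 8 subnet bits: 2**8 = 256 subnets.
home = ipaddress.ip_network("2001:db8:0:ff00::/56")
print(2 ** (64 - home.prefixlen))      # 256
```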
Read more...

Auslogics: How to Fix 3 Most Common Registry Errors

The Windows Registry is one of the most important parts of a computer. It stores information about the system - hardware, operating system software, user settings, and other software. Without the Registry your computer wouldn’t run.

Windows is very sensitive to Registry errors – if there are too many of them, your PC becomes slow, unstable, and might even fail to boot. Registry errors can affect your hardware as well and you’ll have no choice but to spend a lot of money on a new computer. Fortunately, there is an easier solution – Registry repair using a free Windows Registry cleaner from Auslogics.


It can prove handy to know exactly what causes the Registry to slow down. Here are the 3 most common Registry errors:

1. Slow Registry Access

If your computer is slow on startup, freezes up too often, and is generally sluggish, it usually means that the Registry is far too bloated. Some Registry cleanup should make your PC run as fast as ever. You can try fixing the Registry manually, but I wouldn't advise that, because you can accidentally delete a vital entry and seriously harm your computer. A better way is to use the free Auslogics Registry Cleaner.

2. Obsolete Entries

When you uninstall software, some programs don’t get uninstalled properly and leave invalid Registry keys behind. The more often you install and uninstall stuff, the more invalid entries there are. As a result the system needs more time to access the Registry and find the entries it needs. Naturally, this causes drastic computer slow-downs. That’s another reason why you need a Registry cleaner – to delete those entries. Auslogics Registry Cleaner offers you free Registry repair and is extremely easy and safe to use. Unlike other Registry cleaners it doesn’t try to impress you with the number of errors found - Auslogics Registry Cleaner will only detect and offer to delete ‘green’ items and won’t delete valid system entries.

3. Invalid File Extension

Another Registry error that occurs pretty often is “Invalid file extension”. You get this error when Windows can’t associate a file extension with a file type. Each file association includes the ProgID of a program that is used to handle files of that type. If a file type is associated with an invalid ProgID, you won’t be able to handle files of that file type. You’ll need to apply a Registry fix to sort it out – find the invalid extension and delete it.
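One way to apply such a fix is with a .reg file imported via Registry Editor; the sketch below is a hypothetical example (the .txt extension and the built-in txtfile ProgID are used only for illustration) showing the general shape of re-pointing an extension at a valid ProgID:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example: restore the .txt extension to the
; built-in "txtfile" ProgID after an invalid association.
[HKEY_CLASSES_ROOT\.txt]
@="txtfile"
```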

Why Auslogics Registry Cleaner?

There are many Registry cleaners for Windows out there. Most of them cost around US $30. But having to pay for them doesn't mean they are good - a lot of those commercial programs are hardly the best Registry cleaners and can damage your computer. And why spend money when there is a great free Windows Registry repair tool - Auslogics Registry Cleaner. This program is fast, safe, and spyware-free. Its intuitive interface, ease of use, and the Advanced scan option make it ideal for both novice and power users.

Remember to clean the Registry at least once a month, and your computer will run as fast and stably as ever.
Read more...

Understanding Antivirus and Antispyware

Antivirus software is a computer program designed to eliminate computer viruses and other malware. Nowadays, antivirus scan programs have the ability to neutralize all kinds of threats, including worms, Trojan horses, etc., and serve as a catalyst in virus and spyware removal.


Antivirus software programs commonly available on the market include Norton, McAfee, and SOPHOS. Among them, Norton continues to be the bestselling antivirus program and has proved its mettle in detecting viruses.

Likewise, there are antispyware software programs, used solely for the detection and removal of spyware. These programs follow different techniques to complete their task: they either scan files to identify known viruses matching the definitions in their virus dictionary, or track suspicious behavior shown by files and programs stored on the computer. For example, a program trying to write data to an executable file can raise suspicion. Such diagnosis generally also covers data captures, port monitoring, and other tricks.

Always install an anti-spyware program that works effectively with your operating system. Its main advantage lies in its ability to provide high-quality protection against complex spyware attacks. Most of these programs offer complete security by scanning not only e-mail but also all traffic originating from diskettes, USB sticks, and the Internet.

There are specific antivirus programs that the user can easily install, and some are available for free on the Internet. But it is always recommended to buy a software package, as it will have additional features for efficient PC protection.

Some antispyware software can reduce your PC's performance to a large extent. You may disable the antivirus protection to avert the performance loss, but that will increase the risk of infection. If you are unsure which antispyware program to buy, you can consult technicians offering computer support online. Apart from providing the best advice, they can install the most suitable antispyware in no time.

Many companies offer services such as installation and timely upgrades of antivirus software, troubleshooting of PC errors, email support services, support for all operating systems, and other related benefits as part of their computer support package.

By Joseph Jhon, a technical expert and virus remover.
Read more...

Crossfire X vs SLI

New cards mean another new round in the ongoing multi-GPU battle.

With the launch of the HD4800 series, AMD is again forcing its competitor, NVIDIA to go toe-to-toe on price, rather than performance. The GT200 dominates everything that has come before in a straight one-on-one battle, but the Radeons are once more ganging up and attacking en masse. AMD's argument is that many low-cost cards working together are better than one big expensive one.


Viewed this way, CrossFire - as AMD's multi-card tech is known - makes more sense than the competing SLI. After all, why take two NVIDIA cards into the shower when one will do? A single GTX 280 will easily outperform anything else on the market without needing to be paired up. On the other hand, anyone who bought a GeForce 8800GT last year - and there were loads of us - will surely be watching as the price for a suitable partner tumbles.

While it's still being presented as a revolutionary idea, we're used to hardware zerging like this now. Indeed, it's a cheeky move by AMD to claim for its own the territory that NVIDIA first broached with SLI, all those years ago. The question is, with single GPUs getting ever more powerful and games engines standing relatively still, is this so much smoke and marketing mirrors?

Both companies use similar techniques to get their cards working together in harmony. Games are - wherever possible - profiled for the best possible performance increases. By default, the drivers use Alternate Frame Rendering (AFR), where one card renders one frame while the other card prepares the next. In rarer cases, split-frame rendering - where pixels from a single frame are load-balanced between the two cards - will make a game run faster.

Some competition gamers swear by split-frame rendering, arguing that the minor latency introduced in AFR can affect fast-paced games, but for most of us the drivers will simply select AFR and we won't be any the wiser. Indeed, with AMD's control panel you won't have any choice; but while you can customize profiles for NVIDIA cards, it's unlikely you'll ever need to.

Both companies, too, require a hardware bridge to connect the cards together using internal connectors inside the PC. This gives a direct point of communication between the cards independent of the potentially congested PCI Express bus, but isn't fast enough to carry all the data they need to share. So, are you better off going for the very best single GPU card you can lay your hands on, or should you look for a more arcane arrangement of graphics chips? And if you do, should you opt for SLI or CrossFire?

Back to Basics

A superficial glance back at the last 12 months and the answer would seem to favor multi-GPU arrays. NVIDIA's 9800GX2 - two G92 chips on one card - reigned supreme in the performance stakes up until the launch of the GTX 280. By coupling two GX2s together you got the Quake-like power-up of Quad SLI, and framerates that would make your eyes bleed.

AMD, meanwhile, stuck to its guns and released the HD3870X2, a dual-chip version of its highest-end card. In the same kind of performance league as a vanilla 9800GTX, it may not have been elegant but it was great value for money.

That's just the off-the-shelf packages. With the right motherboard two, three or even - in AMD's case - four similar cards can be combined to create varying degrees of gaming power. AMD also had a paper advantage with the fact that HD3850s and HD3870s could be combined together in configurations of up to four cards too.

Both companies even went as far as to release Hybrid SLI and Hybrid CrossFire, matching a low-end integrated graphics chip with a low-end discrete graphics chip. The result in both was much less than the sum of their parts: two rubbish GPUs which, when combined, were still poor for gaming.

And right there, at the very bottom, is where the argument for multi-GPU graphics starts to steadily unravel. Despite all the time that's passed since SLI first reappeared, the law of diminishing returns on additional graphics cards remains. Unfortunately, two cards are not twice as fast as one card, and adding a third card will often increase performance by mere single figure percentages.

That, of course, is if they work together at all. Even now, anyone going down the multiple graphics route is going to spend a lot more time installing and updating drivers to get performance increases in their favorite games. Most infamously, Crysis didn't support dual-GPUs until months after its release, and even then it still required a hefty patch from the developers to get two cards to play nicely together. It's now legend that the one game that could really have benefited from a couple of graphics cards refused point blank to make use of them.

That's very bad news for owners - or prospective owners - of GX2 or X2 graphics cards, which require SLI or CrossFire drivers; so another strike then for the single card. It would be churlish to say things haven't improved at all recently, but suffice to say that in the course of putting together this feature, we had to re-benchmark everything three times more than is normally necessary, because driver issues had thrown up curious results.

Before you even get to installing software, though, there's a bucketload of other considerations to take into account. First and foremost is your power supply: people looking to bag a bargain by linking together two lower-end cards will often find that they have to spend another $100 or so on a high-quality power supply that is not only capable of putting out enough juice for their system, but also has enough six- or eight-pin power connectors for all the graphics cards.

Many is the upgrader who's witnessed the dreaded 'blue flash of death' when the PC equivalent of St Elmo's Fire indicates that the $30 power supply that you thought was a bargain is, in fact, destined for a quick trip to the recycling centre.

Even more critical with the current generation of cards, though, is heat dissipation. All of AMD's HD4800 series can easily top 90°C under load, and a couple of cards in your PC will challenge any amount of airflow you've painstakingly designed for. To make matters even worse, many motherboards stack their PCI-Express ports so closely together the heatsinks are almost touching. The HD3850s are single slot cards, but that means they vent all their heat inside the case.

On the NVIDIA side of things, size is more of an issue. The new cards - both GTX 260 and GTX 280 - are enormous. Even though they're theoretically able to couple up on existing motherboards, it's unlikely you'll find one with absolutely nothing at all - not even a capacitor - protruding between the CPU socket and the bottom of the board.

The merest jumper out of place will prevent the two from sitting comfortably. If all this is beginning to sound a little cynical, let's point out that there have been bright developments in recent history. Most notable is the introduction last year of PCI-Express 2.0, which means there are more motherboards out there with at least two PCI-e sockets at full 16x bandwidth, so it's easier to keep both cards fed full of game data at all times.

Read more...

Difference between Hub and Switch

Even though switches and hubs are both used to link computers together in a network, a switch is more expensive, and a network built with switches is typically a bit faster than one built with hubs. The reason is that once a hub receives data at one of its ports, it transmits that chunk of data out of all of its ports, distributing it to every other computer on the network. If more than one computer on a single network tries to send a packet of data at the same time, a collision occurs and an error results.

When a collision occurs, all of the computers on the network must go through a procedure to resolve it, as prescribed by CSMA/CD (Ethernet Carrier Sense Multiple Access with Collision Detection). Every Ethernet adapter has its own transmitter and receiver; if the adapters weren't required to listen with their receivers for collisions, they would be able to send and receive at the same time. But because they can only do one at a time, no more than one chunk of data can be sent out at once, which slows the whole process down.

Because hubs operate at only half duplex, meaning that data may flow only one way at a time, the hub broadcasts the data from one computer to all the others. The maximum bandwidth is therefore 100 Mbps, shared by all of the computers connected to the network. As a result, when someone using a PC on the hub wants to download a big file or several files from another PC, the network becomes congested. On a 10 Mbps 10Base-T network, the effect is to slow the network to a crawl.

If you want to connect two computers directly over Ethernet, you can do so using a crossover cable. With a crossover cable there is no collision problem: it hardwires the Ethernet transmitter on one PC to the receiver on the other. Most 100Base-TX Ethernet adapters can detect, using a system called auto-negotiation, that listening for collisions is unnecessary, and will run in full duplex when appropriate.

The result is a link with none of the delays that collisions would cause, and data can flow both ways at the same time. The maximum bandwidth is 200 Mbps, i.e. 100 Mbps in each direction.

Read more...

How Cleaning Computer Registry Files Will Fix Your Slow Computer

Most of us use a Windows operating system to run our computers, and one of the most basic parts of the operating system is the registry. The Windows Registry is the database where the settings and information needed to run the computer are stored. Without the Registry your computer wouldn't run.




As we all know, we are constantly installing programs, new tweaks, upgrades, etc. This is how the registry starts to fill up. In fact, whenever you visit a website, files get stored in the registry as well.

There will come a time when the registry fills up and causes the computer to slow down or even freeze. When this happens, the best way to fix it is to clean the registry - that is, to clean your computer registry files.

It is possible to do this yourself, but unless you know exactly how to do it, you shouldn't. You know how sometimes we accidentally delete a file, and then retrieve it through the recycle bin? Well, in the registry, once you delete something, it stays deleted, and this is what makes manual cleaning a delicate task.

Your best shot is to get a registry cleaning program. Of all the registry cleaning programs online, look for one that has a complete scanning system, can search for and find errors, and provides you with a report.

The most reliable program for cleaning computer registry files will also be able to shut down programs that you rarely or never use, and this will enable your computer to run faster since the registry has less items to process.

Finally, the best registry cleaning program has to have an automatic scheduling system for maintenance and repair. This way, your registry will always be free from errors. Well-known registry fixer programs include TuneUp Utilities, RegCure, etc.
Read more...

Windows: Renaming the Recycle Bin

The Recycle Bin is a great feature of Windows, but it is very difficult to customize the name. Unlike other system icons on the desktop, you cannot just right-click it and select Rename as usual.




The only way to rename the Recycle Bin is to hack the Registry. This is not as simple as the method for the other icons, but you can easily get through it. To get started, let's assume that you want to rename the Recycle Bin to 'WaDaH RuNtaH':

1. Click the Start button and select Run.

2. Then type regedit in the box and click OK.

3. When the Registry Editor is started, you will want to expand the HKEY_CURRENT_USER, Software, Microsoft, Windows, CurrentVersion, Explorer, CLSID, and {645FF040-5081-101B-9F08-00AA002F954E} folders.

4. When you have expanded the last folder, you should see an entry called (Default) that has no value. Right-click that entry and select Modify.

5. Next, just type WaDaH RuNtaH, or any other name, in the box and click OK. If you want to hide the text under the Recycle Bin icon, you will still have to specify a name. Instead of typing in a word, just press the spacebar once and then click OK. You do not have to worry about entering the ASCII code for a space when editing the registry.

6. Close the Registry Editor and press F5 when viewing your desktop to see your changes. If that does not work for your computer, then you will have to log out and log in to activate the changes.
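Steps 1 through 5 can also be captured in a .reg file that you import by double-clicking it; a sketch, assuming the same CLSID and the example name from step 5:

```reg
Windows Registry Editor Version 5.00

; Set the per-user display name of the Recycle Bin
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\CLSID\{645FF040-5081-101B-9F08-00AA002F954E}]
@="WaDaH RuNtaH"
```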

Now your Recycle Bin name has changed! :)

Read more...

Flash: get_url not working on Blogger?

Many people complain that get_url in their Flash file stops working after it has been uploaded and embedded on Blogger whenever a link points to the same site, even though links to other sites still work. The problem can be solved by adding a property to the Flash <embed> script.






To allow links to pages on the same site, one property needs to be added inside the embed tag: allowscriptaccess, which must be set to "always". Links in the form of button symbols will then be able to reach any site, including the site itself. An example of an embedded Flash file is as follows:
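A sketch of such an embed tag with the property set (the file name and dimensions are placeholders):

```html
<embed src="movie.swf"
       type="application/x-shockwave-flash"
       allowscriptaccess="always"
       width="400" height="300">
</embed>
```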


Read more...

Prevent Viruses by Disabling Autorun in Vista

Lately, many viruses and worms have been spreading rapidly through storage media and networks. With storage media in particular, a virus or worm can easily infect computers through devices such as flash drives, with the help of the system's AutoRun facility. To prevent such viruses from infecting our computers when a flash drive is plugged in, the AutoRun feature should be disabled.





Disabling the AutoRun feature in Windows Vista

There are actually several ways to disable the autorun function: editing the registry, using the Local Group Policy Editor, using third-party software, and so on. This time we will use the Local Group Policy Editor. It can only be opened manually, meaning there is no shortcut for it either in Programs or in Control Panel.

Therefore, to open it, simply type gpedit.msc in the Run dialog. Since Windows Vista does not place Run on its Start Menu, press Windows + R to open it. Once the Local Group Policy Editor is open, you will see two configuration sections: Computer Configuration and User Configuration. Computer Configuration controls settings for the computer as a whole, while User Configuration applies only to the currently active user.


OK, to disable the Autorun function, we have to set the "Turn off Autoplay" setting to Enabled. The setting is located in the "AutoPlay Policies" folder. Note that both the setting and the folder exist under Computer Configuration and under User Configuration, so it is advisable to disable it in both.


First, open Computer Configuration, then Administrative Templates, then Windows Components. There you will find the AutoPlay Policies folder; open it, then double-click "Turn off Autoplay" to open its properties. Set it to Enabled, and in the "Turn off Autoplay on:" option below, choose "All drives". The same steps must then be repeated under User Configuration. Once configured, plug in a flash drive or insert a CD-ROM and check whether the autorun/autoplay prompt appears. If it does not, the steps were successful and worms/viruses will no longer infect the computer automatically.
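The registry route mentioned above amounts to setting a single policy value; a sketch as a .reg file (the NoDriveTypeAutoRun value of 0xFF disables Autorun on all drive types):

```reg
Windows Registry Editor Version 5.00

; Disable Autorun for all drive types (0xFF), machine-wide
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```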
Read more...
