Search Results

Search found 28693 results on 1148 pages for 'oracle advanced security'.

  • Advanced example-driven C book with a lot of code.

    - by Inso Reiges
    Hello, I am looking for a book on advanced C programming that:

    - Teaches how to effectively express one's solution in C when one already knows the language in depth.
    - Shows some common design idioms expressed in C, like encapsulation, modularity, and that kind of thing.
    - Is example-driven, with a lot of good-quality code.

    I already know the language itself, so books like the otherwise wonderful "Expert C Programming" by Peter van der Linden are not really what I am looking for. What I need is a book on how to express my design in C: what the common idioms are, what the best practices are, and so on. I would also like to note that I am primarily interested in C, not C++, C#, Objective-C, or any other language inspired by C-like syntax. Thank you.

    Read the article

  • Why do I get a security warning in Visual Studio 2008 when creating a project?

    - by MikeG
    This is the error, basically a security warning (text grabbed off the dialog box):

    Security Warning for WindowsApplication4

    The WindowsApplication4 project file has been customized and could present a security risk by executing custom build steps when opened in Microsoft Visual Studio. If this project came from an untrustworthy source, it could cause damage to your computer or compromise your private information.

    Project load options:

    - Load project for browsing: Opens the project in Microsoft Visual Studio with increased security. This option allows you to browse the contents of the project, but some functionality, such as IntelliSense, is restricted. When a project is loaded for browsing, actions such as building, cleaning, publishing, or opening designers could still remain unsafe.
    - Load project normally: Opens the project normally in Microsoft Visual Studio. Use this option if you trust the source and understand the potential risks involved. Microsoft Visual Studio does not restrict any project functionality and will not prompt you again for this project.
    - Ask me for every project in this solution

    [OK] [Cancel]

    When I click the More Details button I get this:

    Microsoft Visual Studio: An item referring to the file was found in the project file "C:\Users\mgriffiths\Documents\Visual Studio 2008\ProjectATemp\WindowsApplication4\WindowsApplication4\WindowsApplication4.vbproj". Since this file is located within a system directory, root directory, or network share, it could be harmful to write to this file. [OK]

    Read the article

  • How to open a VirtualBox (.VDI) Virtual Machine

    - by [email protected]
    How to open a .VDI Virtual Machine

    Sometimes someone shares a virtual machine with the .VDI extension with us, and we may wonder what it is and how to open it. The answer: it is a VirtualBox virtual machine. If you have not downloaded VirtualBox yet, you can do so easily; just follow this post: http://listeningoracle.blogspot.com/2010/04/que-es-virtualbox.html or http://oracleoforacle.wordpress.com/2010/04/14/ques-es-virtualbox/

    Now, with VirtualBox installed, open it and proceed with the following:

    1. Open the Virtual Media Manager.
    2. Click Actions → Add, select the .VDI file, and click "Ok".
    3. A new virtual machine will be displayed (in this case, an OEL5 32GB virtual machine is available).
    4. This step is important. Once you have opened the settings, under the General option click the Advanced settings. Here you should change the default directory for saving your snapshots; my recommendation is to set it to the same directory where the .VDI file is. Otherwise you can end up with the virtual machine and its snapshots in different paths.
    5. Now click on System and proceed to assign the correct memory and define the processors for the virtual machine. Note: enable "Enable IO APIC" if you are planning to assign more than one CPU to the virtual machine.
    6. Associate the storage disk with the virtual machine. The disk must be selected as IDE Primary Master.
    7. You can verify the other options, but with these changes you will be able to start the VM. Note: sometimes the VM owner shares specific instructions; if so, follow them.
    8. Click Ok, push the Start button, and enjoy your virtual machine.
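    For those who prefer the command line, the same setup can be scripted with VBoxManage, the CLI that ships with VirtualBox. Below is a minimal Python sketch of the walkthrough above; the VM name, memory and CPU sizes, and the .VDI path are assumptions for illustration:

        import subprocess

        VM_NAME = "OEL5"             # assumption: pick any name
        VDI_PATH = "/vms/OEL5.vdi"   # assumption: path to the shared .VDI file

        def run(*args):
            """Run one VBoxManage command and raise if it fails."""
            subprocess.run(["VBoxManage", *args], check=True)

        # Create and register an empty VM (steps 1-3 of the walkthrough).
        run("createvm", "--name", VM_NAME, "--register")

        # Memory, CPUs, and IO APIC (step 5); IO APIC is required for >1 CPU.
        run("modifyvm", VM_NAME, "--memory", "2048", "--cpus", "2", "--ioapic", "on")

        # Attach the .VDI as IDE Primary Master, i.e. IDE port 0, device 0 (step 6).
        run("storagectl", VM_NAME, "--name", "IDE Controller", "--add", "ide")
        run("storageattach", VM_NAME, "--storagectl", "IDE Controller",
            "--port", "0", "--device", "0", "--type", "hdd", "--medium", VDI_PATH)

        # Start the VM (step 8).
        run("startvm", VM_NAME)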

    Read the article

  • The Cloud is STILL too slow!

    - by harry.foxwell(at)oracle.com
    If you've been in the computing industry long enough to remember dialup modems and other "ancient" technologies, you might be tempted to marvel at today's wonderfully powerful multicore PCs, ginormous disks, and blazingly fast networks. Wow, you're in Internet Nirvana, right? Well, no, not by a long shot.

    Considering the exponentially growing expectations of what the Web, that is, "the Cloud", is supposed to provide, today's Web/Cloud services are still way too slow.

    Already we are seeing cloud-enabled consumer devices that are stressing even the most advanced public network services: the iPad and its competitors, ever more powerful smartphones, and an imminent horde of special-purpose gadgets such as the proposed "cloud camera" (see http://gdgt.com/discuss/it-time-cloud-camera-found-out-cnr/ ).

    And at the same time that the number and type of cloud services are growing, user tolerance for even the slightest of download delays is rapidly decreasing. Ten years ago Web developers followed the "8-Second Rule" (the average time a typical Web user would tolerate for a page to download and render). Not anymore; now it's less than 3 seconds, and only a bit longer for mobile devices (see http://www.technologyreview.com/files/54902/GoogleSpeed_charts.pdf). How spoiled we've become!

    Google, among others, recognizes this problem and is working to encourage the development of a faster Web (see http://www.technologyreview.com/web/32338/). They, along with their competitors and ISPs, will have to encourage and support significantly better Web performance in order to provide the types of services envisioned for the Cloud. How will they do this? Through the development of faster components, better use of caching technologies, and the really tough one: exploiting parallelism. Not that parallel technologies like multicore processors are hard to build... we already have them. It's just that we're not that good yet at using them effectively. And if we don't get better, users will abandon cloud-based services... in less than 3 seconds.

    Read the article

  • Prognostications for the Future of BI

    - by jacqueline.coolidge(at)oracle.com
    Dashboard Insight has published viewpoints on the future of BI from several vendors' perspectives, including ours, in Business Intelligence Predictions for 2011. We offered:

    In 2011, businesses will demand more from BI. With intense competitive and economic pressures, it's not enough to be interesting. BI must be actionable and enable people to respond smarter and faster to the opportunities and challenges of the day. Most companies rely on BI to help them understand what's going on in their business. Many are ready to make the leap from "What's going on?" to "What are we going to do about it?" Seamless integration from reporting to what-if analysis and scenario modeling helps businesses decide the right course of action. The integration of BI with SOA and BPEL will deliver the true payoff for BI by enabling companies to initiate business processes directly from their analysis, turning insight into action for a more agile and competitive business.

    And, I must admit, it's tough to argue with the trends identified by other vendors:

    - Enabling true self-service and engaging a larger community of users
    - Accelerating the adoption of BI on mobile devices
    - Embracing more advanced analytics, such as data/text mining and location intelligence
    - Price/performance breakthroughs

    It's preaching to the choir. I look forward to hearing the voices of some customers who are pushing the envelope, and I will post those stories as I capture them.

    Read the article

  • Why Do Spreadsheets Not Work in an Enterprise Planning Environment ?

    - by Mike.Hallett(at)Oracle-BI&EPM
    “Around 93% of managers gather or analyze information in spreadsheets and 54% spend more time gathering information than analyzing it...” Find answers in this whitepaper; some extracts below:

    “Traditional budgeting and planning is a straitjacketed and hierarchical exercise... how many businesses have planning and reporting processes that are smart, agile and aligned? The networked economy challenges the fundamentals of business organization; for example, where does the front office stop and the back office start? Is it still meaningful to plan for customer, channel, or product profitability, or is transaction profitability the only measure that counts?

    “Although, conceptually, the idea of enterprise business planning is relatively straightforward, it has proven to be elusive because of over-reliance on spreadsheet-bound processes, a lack of control over data quality/management, limited use of advanced planning tools, and the cultural impediments that afflict many planning processes.

    “In the absence of specialist tools, businesses tend to opt for ‘broad brush’ assumptions in financial plans which merely approximate the more granular assumptions used in operational plans.

    “Most businesses are familiar with the relationship between risk and reward, but in assessing potential opportunities and developing business plans they rarely acknowledge risks and probability in a formal way.

    Get your customer to see how they do against the “Enterprise Business Planning Checklist”: get them to read the whitepaper.

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: a recent customer survey reveals the deleterious effects of data fragmentation. By Trevor Naidoo, December 2010.

    Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions or decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only further added to the complexity. Data fragmentation has become a key inhibitor in delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey over the past two years assessing customers' master data management (MDM) capabilities to get a sense of where they are. The responses, from 27 respondents in six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results.

    1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data and have a roadmap to address missing data domains. Examples of the types of master data domains referred to are customer, supplier, product, financial, and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source; customer data stored in the customer relationship management system is inconsistent with customer data stored in the order management system. Imagine not knowing how many places you stored your customer information, and whether a customer's address was the most up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad hoc basis. It is important for organizations to document their inventory of data sources and then profile these data sources to ensure that there is a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross-reference for all other sources and ensures consistent, high-quality master data throughout the organization.

    2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all their interactions with customers as they move through the sales cycle, the service department tracks their interactions with the same customers independently, and the finance department has yet another perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create key linkages between customer, product, site, supplier, and financial data. These linkages make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact.

    3. Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see #2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross-reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy will ensure that a central cross-reference is maintained, with updates in any one application being propagated to all the other systems, so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology.

    4. Approximately 50 percent of respondents expend manual effort cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations, for example, "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; for example, a size attribute can be spelled out in inches or entered using the inch symbol ("). These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse, and de-duplicate data, clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers, and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we get communications about products we don't own, at addresses where we don't reside, and through channels (like direct mail) we don't like. An all-too-common example: customers end up receiving duplicate communications, which not only impacts customer satisfaction but also incurs additional mailing costs. Cleansing, normalizing, and standardizing data will help address most of these issues.

    5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place that profile, standardize, and cleanse data manually, and the output of these efforts is stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, this enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data but also to share the data back to the source systems, as well as to other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system. Having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.

    Characteristics of Stellar MDM

    When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics:

    - enterprise-grade MDM performance
    - complete technology that can be rapidly deployed and that addresses multiple business issues
    - end-to-end MDM process management with data quality monitoring and assurance
    - pre-built, business-relevant MDM applications with data stores and workflows

    These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities as a result of better understanding your customers.

    Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle.
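    To make point 3 concrete, here is a minimal Python sketch of the cross-reference an analyst otherwise rebuilds in Excel every month; the product numbers and sales figures are invented for illustration:

        from collections import defaultdict

        # Hypothetical data: one product carries different local numbers per country.
        cross_reference = {
            "US-1001": "GLOBAL-1",
            "UK-553":  "GLOBAL-1",
            "DE-77":   "GLOBAL-1",
            "US-1002": "GLOBAL-2",
        }

        # Sales pulled from each regional system, keyed by local product number.
        regional_sales = [("US-1001", 120000), ("UK-553", 45000),
                          ("DE-77", 30000), ("US-1002", 80000)]

        # With a maintained cross-reference, global aggregation is a single pass;
        # no manual re-matching is needed the following month.
        global_sales = defaultdict(int)
        for local_id, amount in regional_sales:
            global_sales[cross_reference[local_id]] += amount

        print(dict(global_sales))  # {'GLOBAL-1': 195000, 'GLOBAL-2': 80000}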

    Read the article

  • XP: how to apply security to files, now that I have simple file sharing and can't access some files from other machines?

    - by Jules
    For a month or two now I've been using simple file sharing; for several months before that I didn't, and before that I had simple file sharing turned on. So at the moment I don't have a Security tab (on files or folders) or sharing permission settings either. As an example, from another machine I can access files from 2007, but not files from the summer of last year in the same folder. I can access all files on the local machine itself. So I think I just need to re-apply security or permissions somehow? What should I do?

    Read the article

  • Is there a way to add AD LDS users to an AD Domain Group or allow them domain security rights?

    - by Tom
    I have a web application in which our outside customers need access to run transactions (stored procs on SQL Server) on our domain. We have looked into AD LDS to keep these users separate from our domain. The problem we are having is allowing the LDS users the AD security rights to access these stored procs. For administration purposes we would like to use an AD group for each transaction (stored proc) that has access to execute it. Is there a way to add LDS users to this AD group or allow them the security rights to do this? We have set up LDS and can authenticate an AD user through it to run these transactions. LDS is running on Server 08 R2. AD is also Server 08 R2. Thanks.

    Read the article

  • "The site's security certificate is not trusted!" on every SSL page?

    - by Isaac Waller
    I'm using the latest Chrome dev build on Mac OS X. Recently, I've been getting this message on any HTTPS webpage the first time I visit it:

    The site's security certificate is not trusted! You attempted to reach checkout.google.com, but the server presented a certificate issued by an entity that is not trusted by your computer's operating system. This may mean that the server has generated its own security credentials, which Google Chrome cannot rely on for identity information, or an attacker may be trying to intercept your communications. You should not proceed, especially if you have never seen this warning before for this site.

    Why is this happening, and how can I fix it? It may be because of my development build, but many other people use the dev version too, and I'd expect it to have been fixed quicker than this.

    Read the article

  • OraOps10.dll loading problem

    - by Rodnower
    Hello, I have an ASP.NET web service built on Windows 7 in 32-bit. All dependencies of this service are compiled in Release mode for x64. Now I've installed it on Windows 8 64-bit, and when I access the service I get the error "Could not load OraOps10.dll". I haven't managed to find anything on the internet about this problem with the Oracle client in the context of 32/64-bit incompatibility. Do you have any ideas? Thank you very much.

    Read the article

  • What is a good layout for a somewhat advanced home network and storage solution?

    - by Shaun
    My home network/storage needs are changing and I am searching for opinions and starting points on a good network/storage layout that can serve my needs for a few years into the future. I think I have a decent starting point for equipment, but I am also willing to invest fairly heavily in a solution that can last me for a while. I am a bit of a tech nerd and I have a moderate tolerance for setup of the solution. I would prefer that maintenance of the system be somewhat low once it is set up, but I am willing to accept some tradeoffs.

    Existing equipment:

    - Router: Netgear WNDR3700 (gigabit)
    - Router: D-Link Gamerlounge DGL-4300 (gigabit)
    - Switch: 16-port Trendnet green switch (gigabit)
    - Switch: 5-port Trendnet green (gigabit)
    - Computer: i7-950 office computer (gigabit ethernet)
    - Computer: Q6600 quad-core media center, hooked up to TV, records shows (gigabit ethernet)
    - Computer: Acer 1810T ultraportable laptop (gigabit and wireless N)
    - NAS: Intel SS4200-E (gigabit)
    - External hard drive: 2TB WD Green drive (eSATA)
    - All kinds of miscellaneous network-connected TV, Blu-ray, Verizon network extender, HDHomeRun TV tuners, etc.

    Requirements:

    - Robust backup solution for a growing collection of huge family picture files and personal files, around 1.5TB (including offsite backup)
    - Central location for all users' files, while also keeping them secure from each other
    - Storage for terabytes of movie backups and recorded TV, and access to them from all computers (maybe around 4TB eventually)
    - Possibility to host files to friends and family easily

    Nice to have:

    - Backup of terabytes of movie backups

    Intriguing possibilities:

    - Capability to have users' Windows desktops and files look the same from all network computers

    I am not sure if the new Windows Home Server 2011 would fit into this well, whether I need a domain server, how best to organize my backups, or how to most effectively use RAID. Currently I am simply backing up all computers to a RAID 1 on the NAS box, which I was thinking could prevent a situation where I reach for a backup and find that the disk is corrupt.

    One possibility I am thinking about now is simply using my media center PC with a huge RAID of hard drives on which all files are stored. Pseudo-backup of all files would be present because of the RAID, but important files would also be backed up offsite by carrying hard drives to work. But what if corruption seeps into the files and the corrupted data is then backed up? Does RAID protect against this? I really want to take next to zero risks with the irreplaceable files; I can handle some degree of risk with the movies and other files. I'm looking for critiques on this idea as well as other possibilities.

    To summarize, my goal is high functionality, media capability, and robust backup of irreplaceable files.
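    One caution on the RAID question above: RAID 1 faithfully mirrors whatever the disk returns, including silently corrupted blocks, so it does not detect corruption by itself. A checksum manifest verified before each backup run is a common safeguard; here is a minimal Python sketch, where the photo directory and manifest location are assumptions:

        import hashlib
        import json
        import pathlib

        PHOTOS = pathlib.Path("/data/photos")     # assumption: the irreplaceable files
        MANIFEST = pathlib.Path("manifest.json")  # assumption: where hashes are kept

        def sha256(path, chunk=1 << 20):
            """Hash a file in 1 MB chunks."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                while block := f.read(chunk):
                    h.update(block)
            return h.hexdigest()

        current = {str(p): sha256(p) for p in PHOTOS.rglob("*") if p.is_file()}

        if MANIFEST.exists():
            previous = json.loads(MANIFEST.read_text())
            # A file whose bytes changed without being edited is a suspect for
            # silent corruption; investigate before a backup overwrites the
            # last good copy.
            for path, digest in previous.items():
                if path in current and current[path] != digest:
                    print("changed since last run:", path)

        MANIFEST.write_text(json.dumps(current, indent=2))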

    Read the article

  • What permissions do I need to run SQL*Loader?

    - by Jason Baker
    What permissions does a database user need to be able to run Oracle's SQL*Loader? For instance, since SQL*Loader will disable indexes and triggers, does it need ALTER permissions on those items? This seems like a simple question, but I can't find any documentation on this in the manual.

    Read the article

  • ZFS/Btrfs/LVM2-like storage with advanced features on Linux?

    - by Easter Sunshine
    I have 3 identical internal 7200 RPM SATA hard disk drives on a Linux machine. I'm looking for a storage setup that will give me all of this:

    - Different data sets (filesystems or subtrees) can have different RAID levels, so I can choose performance, space overhead, and risk trade-offs differently for different data sets while having a small number of physical disks (very important data can be 3x RAID 1, important data can be 3x RAID 5, unimportant reproducible data can be 3x RAID 0).
    - If each data set has an explicit size or size limit, the ability to grow and shrink the size limit (offline if need be).
    - Avoids out-of-kernel modules.
    - R/W or read-only COW snapshots. If they are block-level snapshots, the filesystem should be synced and quiesced during a snapshot.
    - Ability to add physical disks and then grow/redistribute RAID 1, RAID 5, and RAID 0 volumes to take advantage of the new spindle and make sure no spindle is hotter than the rest (e.g., in NetApp, growing a RAID-DP raid group by a few disks will not balance the I/O across them without an explicit redistribution).

    Not required, but nice-to-haves:

    - Transparent compression, per-file or per-subtree. Even better if, like NetApp, it analyzes the data first for compressibility and only compresses compressible data.
    - Deduplication that doesn't have huge performance penalties or require obscene amounts of memory (NetApp does scheduled deduplication on weekends, which is good).
    - Resistance to silent data corruption like ZFS (this is not required because I have never seen ZFS report any data corruption on these specific disks).
    - Storage tiering, either automatic (based on caching rules) or via user-defined rules (yes, I have all-identical disks now, but this will let me add a read/write SSD cache in the future). If it's user-defined rules, these rules should have the ability to promote to SSD on a file level and not a block level.
    - Space-efficient packing of small files.

    I tried ZFS on Linux, but the limitations were:

    - Upgrading is additional work because the package is in an external repository and is tied to specific kernel versions; it is not integrated with the package manager.
    - Write IOPS do not scale with the number of devices in a raidz vdev.
    - Cannot add disks to raidz vdevs.
    - Cannot have select data on RAID 0 to reduce overhead and improve performance without additional physical disks or giving ZFS a single partition of the disks.

    ext4 on LVM2 looks like an option, except I can't tell whether I can shrink, extend, and redistribute RAID-type logical volumes onto new spindles (of course, I can experiment with LVM on a bunch of files). As far as I can tell, it doesn't have any of the nice-to-haves, so I was wondering if there is something better out there. I did look at LVM dangers and caveats, but then again, no system is perfect.

    Read the article

  • Memory Usage for Databases on Linux

    - by Kyle Brandt
    With free's output, what we care about for application memory usage is generally the amount of free memory in the -/+ buffers/cache line. What about with database applications such as Oracle: is it important to have a good amount of cached and buffered memory available for a database to run well with all its IO? If that makes any sense, how do you figure out just how much?
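    One way to put a number on it: on Linux, the -/+ buffers/cache figure is derived from /proc/meminfo (MemFree plus Buffers plus Cached), so you can watch how much the page cache is holding while the database runs a typical workload. A minimal Python sketch:

        def meminfo():
            """Parse /proc/meminfo into a dict of values in kB."""
            info = {}
            with open("/proc/meminfo") as f:
                for line in f:
                    key, value = line.split(":")
                    info[key] = int(value.split()[0])  # drop the 'kB' unit
            return info

        m = meminfo()
        cache = m["Buffers"] + m["Cached"]
        print("free:                   %6d MB" % (m["MemFree"] / 1024))
        print("buffers + cache:        %6d MB" % (cache / 1024))
        print("-/+ buffers/cache free: %6d MB" % ((m["MemFree"] + cache) / 1024))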

    Read the article

  • Notepad++ to search a line for varied strings: advanced commands

    - by Capum
    Here it is:

    [email protected], Charles Dawson [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], Queen Roffman [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], Wanda Ximenes [email protected], [email protected],

    I want to remove the strings before the emails so that only the addresses remain, separated by commas. Also, I want to search for any repeated or duplicated terms or words in the whole text. What command for that purpose would report, for example, '[email protected]'?
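    In Notepad++ this is a regular-expression find-and-replace job, and the same transformation is easy to verify with a short script. Below is a minimal Python sketch that keeps only the address tokens and reports duplicates; the input file name is an assumption, and since the addresses above are redacted placeholders, it must be run against the real list:

        import re
        from collections import Counter

        text = open("emails.txt").read()  # assumption: the list saved to a file

        # Keep only the address tokens; names and other strings are dropped.
        emails = re.findall(r"[\w.+-]+@[\w.-]+", text)
        print(", ".join(emails))

        # Report any address that appears more than once.
        for email, count in Counter(emails).most_common():
            if count > 1:
                print("duplicate:", email, "x", count)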

    Read the article

  • What's the most advanced SIP client for Linux these days?

    - by Stefan Armbruster
    I'm currently struggling with which SIP client software to use with respect to Ubuntu/GNOME. Some clients I've looked at so far:

    - Blink: seems promising, but the Linux variant lacks a lot of features.
    - Twinkle: latest release is ~2 years old. AFAIK the only one capable of encrypting calls using ZRTP.
    - Empathy: default tool for IM on Ubuntu.
    - Ekiga

    Some features I'd like to see:

    - availability of buddies
    - conference calls
    - call log
    - chat
    - desktop sharing (Blink seems to do that for Mac)

    So my question is: what client software do you prefer, and for what reason?

    Read the article

  • Problem getting php_oci8 working on Linux RHEL 5

    - by Jonathan
    Hi all, I'm installing Oracle OCI8 on a Linux server here and I am having an issue where php_oci8.so does not seem to be able to find libclntsh.so.11.1. I've got the Instant Client installed and it shows up fine in ldconfig -p, but when I do ldd on php_oci8.so it shows up as not found. Does anyone have any ideas as to what I can check?

    Read the article

  • OAS log files filling up hard drive

    - by Andrew Hampton
    We've had issues with log files for Oracle Application Server filling up the hard drive on our server. The files are in the /network/admin folder and are named server.log_XXXXX.trc and client.log_XXXXX.trc where XXXXX are 5 digits. The files are typically anywhere from 1-2MB in size but can be up to 100MB and thousands of them are created at a rate of about 5-10 per minute. Does anyone know how to disable these logs? Thanks!
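    Files named like these are often SQL*Net traces; if that is what they are, the TRACE_LEVEL_SERVER and TRACE_LEVEL_CLIENT parameters in sqlnet.ora control them, and setting those to OFF stops new files from appearing. Until the source is confirmed, a scheduled cleanup keeps the disk alive. A minimal Python sketch, where the directory and the seven-day retention are assumptions:

        import pathlib
        import time

        LOG_DIR = pathlib.Path("/network/admin")  # assumption: the folder from above
        MAX_AGE = 7 * 24 * 3600                   # assumption: keep one week

        cutoff = time.time() - MAX_AGE
        for pattern in ("server.log_*.trc", "client.log_*.trc"):
            for trc in LOG_DIR.glob(pattern):
                if trc.stat().st_mtime < cutoff:
                    trc.unlink()  # delete traces older than the cutoff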

    Read the article

  • How can I override the remote administrator security policy on Android 2.2 so that I can disable the lock screen?

    - by hagope
    On Android 2.2 Froyo, I added my corporate Exchange email account to the phone; however, the security policy set by the remote administrator requires that I enter a 4-digit PIN at the lock screen, with a maximum 10-second idle timeout. How can I hack my Android, through root access or otherwise, so that I do not need to follow this security policy? I am very annoyed at having to enter the PIN every time I want to use the phone, because I open and close it so often throughout the day. Please help... I'm so surprised at how difficult it is to find the answer!

    Read the article

  • Which is the preferred internet security + antivirus solution for Windows, with a good detection rate? [closed]

    - by metal gear solid
    Possible Duplicate: Free antivirus solutions for Windows

    Which is the best internet security + antivirus solution for Windows? Free/open source or commercial, it doesn't matter; I need the best solution. Is Kaspersky the best, or is there another? http://www.kaspersky.com/kaspersky_internet_security

    Award-winning technologies in Kaspersky Internet Security 2010 protect you from cybercrime and a wide range of IT threats:

    - Viruses, Trojans, worms and other malware, spyware and adware
    - Rootkits, bootkits and other complex threats
    - Identity theft by keyloggers, screen capture malware or phishing scams
    - Botnets and various illegal methods of taking control of your PC or netbook
    - Zero-day attacks, new fast-emerging and unknown threats
    - Drive-by download infections, network attacks and intrusions
    - Unwanted, offensive web content and spam

    Read the article
