Search Results

Search found 4072 results on 163 pages for 'social analytics'.

Page 141/163 | < Previous Page | 137 138 139 140 141 142 143 144 145 146 147 148  | Next Page >

  • MongoDB on 128mb 32-bit VPS (plus Tornado and Redis)

    - by apito
    I am curious about how MongoDB will perform on a limited VPS. Specifically, I'll deploy this configuration on a 32-bit Ubuntu 9.04 server with 128 MB of memory (UPDATE: now I'm considering 360 MB too): nginx and Redis; three instances of Tornado apps (one is for a mobile site; a limited app, not my primary audience); and MongoDB with around 8 collections. It's a social webapp for my community. Everything besides MongoDB seems to have a small footprint. Memory-mapping-wise, I don't know how MongoDB will behave. I know it's a bit of a stretch to use this kind of config on a tiny VPS, but that's what I can afford for now. I expect to handle maybe ~50 users / 15 rps. I did my homework on frontend optimizations and YSlow gives me grade A 91 (ruleset V2) :-) Anyone willing to share experiences? E.g. how big the data set gets before Mongo hits the ceiling, performance when Mongo does a lot of disk IO, etc. Thanks. UPDATE: this is my pet project. I'll get back to you when I have spare time to run some httperf tests in a VirtualBox VM with the exact spec. Suggestions on how to do stress testing are welcome; I'm new to this kind of stuff.
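
    A rough way to sanity-check requests-per-second before setting up httperf is a tiny concurrent-request script. The sketch below is only an illustration: the endpoint URL, worker count and request count are made-up placeholders, and it should be run from a second machine (or the host) so the load generator doesn't compete with the VPS for memory.

        # Rough load-test sketch (not a substitute for httperf/ab): hammers one URL
        # with N concurrent workers and reports requests per second and error count.
        # The URL, worker count, and request count are assumptions -- adjust for your app.
        import time
        import urllib.request
        from concurrent.futures import ThreadPoolExecutor

        URL = "http://127.0.0.1:8000/"   # hypothetical Tornado endpoint
        WORKERS = 20
        REQUESTS = 500

        def fetch(_):
            try:
                with urllib.request.urlopen(URL, timeout=10) as resp:
                    resp.read()
                return True
            except Exception:
                return False

        start = time.time()
        with ThreadPoolExecutor(max_workers=WORKERS) as pool:
            results = list(pool.map(fetch, range(REQUESTS)))
        elapsed = time.time() - start
        print("%d requests in %.1fs -> %.1f req/s, %d errors"
              % (REQUESTS, elapsed, REQUESTS / elapsed, results.count(False)))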

    Read the article

  • MS Excel 2010 - Using DSN + 32 bit drivers

    - by Kristiaan
    I need some advice as I'm running into a problem and so far I have been unable to find a solution. We have a set of reports developed in MS Excel that use DSN files to connect to data sources and retrieve data. These work fine on 32-bit and 64-bit systems; however, we are moving to a terminal server environment using Windows 2008 R2 64-bit. The reports fail to run using the DSNs within this environment if we only have the 32-bit drivers installed and configured in the ODBC settings; the minute we install the 64-bit drivers the software works. Is there a way or method of getting Excel or the DSN file to NOT use the 64-bit driver, but force it to use the 32-bit driver? ANSWERED - but due to a low user score I cannot "answer" my own question... Sadly there is no way to do what I want to do without a lot of very nasty and not 100% reliable registry hacks. If you need to access 32-bit ODBC data sources, the application in question has to be 32-bit. Here is a link to just one forum post I found relating to this type of problem; it appears the only way I would be able to accomplish this is to remove the 64-bit version of Office and install the 32-bit version instead. http://social.msdn.microsoft.com/Forums/en-US/accessdev/thread/5108f337-f06a-4518-afe3-d3c1abd040ef/

    Read the article

  • Know your Data Lineage

    - by Simon Elliston Ball
    An academic paper without the footnotes isn't an academic paper. Journalists wouldn't base a news article on facts that they can't verify. So why would anyone publish reports without being able to say where the data has come from and be confident of its quality; in other words, without knowing its lineage (sometimes referred to as 'provenance' or 'pedigree')?

    The number and variety of data sources, both traditional and new, increases inexorably. Data comes clean or dirty, processed or raw, unimpeachable or entirely fabricated. On its journey from its source to our report, the data can travel through a network of interconnected pipes, passing through numerous distinct systems, each managed by different people. At each point along the pipeline, it can be changed, filtered, aggregated and combined. When the data finally emerges, how can we be sure that it is right? How can we be certain that no part of the data collection was based on incorrect assumptions, that key data points haven't been left out, or that the sources are good? Even when we're using data science to give us an approximate or probable answer, we cannot have any confidence in the results without confidence in the data from which they came.

    You need to know what has been done to your data, where it came from, and who is responsible for each stage of the analysis. This information represents your data lineage; it is your stack-trace. If you're an analyst, suspicious of a number, it tells you why the number is there and how it got there. If you're a developer, working on a pipeline, it provides the context you need to track down the bug. If you're a manager, or an auditor, it lets you know the right things are being done. Lineage tracking is part of good data governance.

    Most audit and lineage systems require you to buy into their whole structure. If you are using Hadoop for your data storage and processing, then tools like Falcon allow you to track lineage, as long as you are using Falcon to write and run the pipeline. It can mean learning a new way of running your jobs (or using some sort of proxy), and even a distinct way of writing your queries. Other Hadoop tools provide a lot of operational and audit information, spread throughout the many logs produced by Hive, Sqoop, MapReduce and all the various moving parts that make up the ecosystem. To get a full picture of what's going on in your Hadoop system you need to capture both Falcon lineage and the data-exhaust of other tools that Falcon can't orchestrate.

    However, the problem is bigger even than that. Often, Hadoop is just one piece in a larger processing workflow. The next step of the challenge is how you bind together the lineage metadata describing what happened before and after Hadoop, where 'after' could be a data analysis environment like R, an application, or even directly into an end-user tool such as Tableau or Excel. One possibility is to push as much as you can of your key analytics into Hadoop, but would you give up the power and familiarity of your existing tools in return for a reliable way of tracking lineage? Lineage and auditing should work consistently, automatically and quietly, allowing users to access their data with whatever tool they require.

    The real solution, therefore, is to create a consistent method by which to bring lineage data from these various disparate sources into the data analysis platform that you use, rather than being forced to use the tool that manages the pipeline for the lineage and a different tool for the data analysis. The key is to keep your logs and your audit data from every source, bring them together, and use the data analysis tools to trace the paths from raw data to the answer that data analysis provides.
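
    A minimal sketch of that idea, assuming nothing about any particular lineage tool: collect one record per processing step from whatever logs and audit trails you have, then walk backwards from a reported figure to its raw sources. The dataset names and systems below are invented purely for illustration.

        # Minimal illustration of the "stack-trace for data" idea: collect lineage
        # records from every system into one place, then walk backwards from a
        # reported number to its raw sources. The record format and names here are
        # invented for the example, not taken from any particular lineage tool.
        lineage = [
            # (output dataset, list of inputs, system that produced it)
            ("sales_report.total_revenue", ["warehouse.fact_sales"], "Tableau"),
            ("warehouse.fact_sales", ["staging.orders", "staging.refunds"], "Hive"),
            ("staging.orders", ["raw.orders_feed"], "Sqoop"),
            ("staging.refunds", ["raw.refunds_feed"], "Sqoop"),
        ]

        def trace(target, records, depth=0):
            """Print every upstream step that contributed to `target`."""
            for output, inputs, system in records:
                if output == target:
                    for src in inputs:
                        print("  " * depth + "%s <- %s  (via %s)" % (target, src, system))
                        trace(src, records, depth + 1)

        trace("sales_report.total_revenue", lineage)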

    Read the article

  • Why does clicking on Windows 7 Printer Properties Result In Driver Not Installed?

    - by octopusgrabbus
    The question I need to ask is: has anyone heard of getting a "driver not installed" error when clicking on a printer's properties in Windows 7, and is there a workaround? Here are the details of the problem. One of our users has a Windows 7 desktop and an HP LaserJet 4050 T connected via a parallel-to-USB converter. The PCL5 universal driver was installed for series 4050 printers. I needed to install the PCL6 driver, which completed successfully. The user is an administrator of the system, and I was prompted to and accepted running as Administrator to install the driver. After the install, I went to see the 4050's properties and was prompted that the PCL6 driver was not installed. I believe the PCL6 driver was installed, because the PCL5 driver resulted in an official HP error page indicating the printer was "not set up for collating" as the second page of printing two copies of a one-page email; this problem did not occur with the PCL6 driver. Oddly enough, setting back to PCL5 produced the same error about the PCL5 driver not being installed. I ignored/dismissed the error box (did not re-install the driver) and reproduced the error, with the second page being the HP "not set up for collating" error page. Any thoughts on what is causing this and how to clear it would be appreciated. The closest fix I could find was on a Microsoft tech page, which had me reset winsock from an Administrator command line, followed by a reboot. That did not fix the problem. I have also found this http://social.technet.microsoft.com/Forums/windowsserver/en-US/5101195b-3aca-4699-9a06-db4578614e2d/changing-driver-results-in-printer-driver-is-not-installed-error-on-server-2008?forum=winserverprint and will look into trying some of these suggestions, which appear to me to be a "shotgun" approach to fixing the problem.

    Read the article

  • Set up a low-cost image storage server with a 24x SSD array to get high IOPS?

    - by Nenad
    I want to build, let's call it, a low-cost Ra*san which would host the images for our social site (many millions); we have 5 sizes of every photo, at 3 KB, 7 KB, 15 KB, 25 KB and 80 KB per image. My idea is to build a server with 24x consumer 240 GB SSDs in RAID 6, which will give me some 5 TB of disk space for the photo storage. To have HA I can add a second one and use DRBD. I'm looking to get above 150,000 IOPS (4K random reads). As we mostly have read-only access and rarely delete photos, I think I can go with consumer MLC SSDs. I read many endurance reviews and don't see a problem there as long as we don't rewrite the cells. What do you think about my idea? - I'm not sure between RAID 6 or RAID 10 (more IOPS, cost of SSDs). - Is ext4 OK for the filesystem? - Would you use 1 or 2 RAID controllers, with an expander backplane? If anyone has built something similar I would be happy to get real-world numbers. UPDATE: I have bought 12 (plus some spares) OCZ Talos 480 GB SAS SSD drives; they will be placed in a 12-bay DAS and attached to a PERC H800 (1 GB NV cache, manufactured by LSI, with FastPath) controller. I plan to set up RAID 50 with ext4. If someone is wondering about benchmarks, let me know what you would like to see.
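
    For what it's worth, the usable-capacity arithmetic behind the RAID 6 versus RAID 10 question can be checked in a couple of lines; the sketch below just applies the standard formulas (RAID 6 loses two drives to parity, RAID 10 mirrors everything) to the drive count and size quoted in the post.

        # Back-of-the-envelope usable capacity for the two RAID levels mentioned.
        # Drive count and size are from the post; formulas are the standard ones
        # (RAID 6 loses two drives to parity, RAID 10 mirrors everything).
        drives = 24
        size_gb = 240

        raid6_usable = (drives - 2) * size_gb      # 22 * 240 = 5280 GB (~5 TB)
        raid10_usable = drives // 2 * size_gb      # 12 * 240 = 2880 GB (~2.8 TB)

        print("RAID 6 : %d GB usable" % raid6_usable)
        print("RAID 10: %d GB usable" % raid10_usable)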

    Read the article

  • Hotmail mail delivery issue (spam)

    - by chaochito
    Hello, I am running a Postfix server on a dedicated server in a Linux environment (CentOS 5.3) for a social networking web application, and I am experiencing deliverability issues with Hotmail (mail to Gmail, Yahoo and AOL lands in the inbox). I only send legitimate mail to registered users (notifications). I have SPF, DK and DKIM set up. I pass the Sender ID test when mailing to [email protected], but we only get "X-Auth-Result: None" in the Hotmail headers and no "X-SID-Result: Pass". We have been enrolled in their program for more than 2 weeks, and normally when you apply to their Sender ID program you are supposed to get "X-SID-Result: Pass" and "X-Auth-Result: Pass". I contacted Hotmail about the issue and they told me that my domain looks like it has been added to Sender ID in their system, that this is beyond their support, and asked me to contact my ISP. As you can imagine, my ISP has no clue about that either. I don't really know what could be wrong... Mails are currently filtered as spam and we would like them to land in the inbox.

    Read the article

  • Web Safe Area (optimal resolution) for web app design?

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'Web Safe Area' I should optimize the app layout and design for. By Web Safe Area I mean the actual area available to display the website in the browser (which is influenced by monitor resolution as well as the space taken up by the browser and OS). I did some investigation and thinking on my own but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal Display Resolution: w3schools web stats seems to be the most referenced source (however they state that these are results from their site and are biased towards tech-savvy users); http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services); StatCounter Global Stats Display Resolution (stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites); NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live stats customers; the data is compiled from approximately 160 million visitors per month).

    Display Resolution Summary: There is a bit of variation between the above sources but in general, as of Jan 2011, it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these with 15-20% of the total web, depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers (the app I'm designing is Flash-based; Apple mobile devices don't support Flash, so iPad support isn't a concern). My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS Constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7). Win7 has the biggest taskbar footprint at a height of 40px (XP's and Vista's is 30px). The default IE8 view uses up 25px at the bottom of the screen with the status bar and a further 120px at the top of the screen with the window's title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar). Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline. This means that we are left with 583px of vertical space and 1276px of horizontal. In other words, a Web Safe Area of 1276 x 583.

    Is this a correct line of thinking? I'm really surprised that I couldn't find this type of investigation anywhere on the web. Lots of websites talk about designing for 1024x768, but that's only half the equation! There is no mention of browser/OS influences on the actual area you have to display the site/app. Any help on this would be greatly appreciated! Thanks.

    EDIT: Another caveat to my line of thinking above is that different browsers actually take up different amounts of pixels based on the OS they're running on. For example, under WinXP IE8 takes up 142px at the top of the screen (instead of the aforementioned 120px for Win7) because the file menu shows up by default on XP, while in Win7 the file menu is hidden by default. So it looks like on WinXP + IE8 the Web Safe Area would be a mere 572px (768px - 142 - 30 - 24 = 572).
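
    The arithmetic above is easy to package up if you want to test other chrome/taskbar combinations; the little helper below simply reproduces the post's numbers (the Win7/IE8 case and the WinXP edit), so any other pixel figures you feed it are your own assumptions.

        # Reproduces the post's arithmetic: usable "web safe area" = screen resolution
        # minus OS taskbar and browser chrome. The pixel figures used below are the
        # ones quoted above (Win7 + IE8 with the favorites bar, and the WinXP variant).
        def web_safe_area(width, height, taskbar, browser_top, bottom_bar, borders=4):
            return (width - borders, height - taskbar - browser_top - bottom_bar)

        # Win7 + IE8 on a 1280x768 display: 40px taskbar, 120px top chrome, 25px status bar
        print(web_safe_area(1280, 768, taskbar=40, browser_top=120, bottom_bar=25))  # (1276, 583)

        # WinXP + IE8 (file menu visible): 30px taskbar, 142px top chrome, 24px bottom (per the edit)
        print(web_safe_area(1280, 768, taskbar=30, browser_top=142, bottom_bar=24))  # (1276, 572)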

    Read the article

  • Investigating on xVelocity (VertiPaq) column size

    - by Marco Russo (SQLBI)
    In January I published an article about how to optimize high cardinality columns in VertiPaq. In the meantime, VertiPaq has been rebranded to xVelocity: the official name is now "xVelocity in-memory analytics engine (VertiPaq)", but using xVelocity and VertiPaq when we talk about Analysis Services has the same meaning. In this post I'll show how to investigate the column sizes of an existing Tabular database so that you can find the most important columns to be optimized.

    A first approach can be looking in the DataDir of Analysis Services for the folder containing the database. Then, look for the biggest files in all subfolders and you will find the name of a file that contains the name of the most expensive column. However, this heuristic process is not very efficient. A better approach is using a DMV that provides the exact information. For example, by using the following query (open SSMS, open an MDX query on the database you are interested in and execute it) you will see all database objects sorted by used size in descending order:

        SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS ORDER BY used_size DESC

    You can look at the first rows to understand what the most expensive columns in your tabular model are. The interesting data provided are: TABLE_ID, the name of the object (it can also be a dictionary or an index); COLUMN_ID, the column the object belongs to (you may also see ID_TO_POS and POS_TO_ID in case they refer to internal indexes); RECORDS_COUNT, the number of rows in the column; and USED_SIZE, the memory used by the object.

    By looking at the ratio between USED_SIZE and RECORDS_COUNT you can understand what you can do in order to optimize your tabular model. Your options are: Remove the column (yes, if it contains data you will never use in a query, simply remove the column from the tabular model). Change granularity (if you are tracking time and you included milliseconds but seconds would be enough, round the data source column to the nearest second; if you have a floating point number but two decimals are good enough, e.g. a temperature, round the number to the nearest decimal that is relevant to you). Split the column (create two or more columns that have to be combined together in order to produce the original value; this technique is described in the VertiPaq optimization article). Sort the table by that column (when you read the data source, you might consider sorting data by this column, so that the compression will be more efficient; however, this technique works better on columns that don't have too many distinct values and you will probably move the problem to another column; sorting data starting from the lower-density columns, those with a small number of distinct values, and going to higher-density columns, those with high cardinality, is the technique that provides the best compression ratio).

    After the optimization you should be able to reduce the used size and improve the count/size ratio you measured before. If you are interested in a longer discussion about internal storage in VertiPaq and you want to understand why this approach can save you space (and time), you can attend my 24 Hours of PASS session "VertiPaq Under the Hood" on March 21 at 08:00 GMT.
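
    If you want to go one step further than eyeballing the ratio, you can save the DMV result grid from SSMS to a file and crunch it offline. The sketch below is one way to do that with pandas; the CSV path is hypothetical and it assumes the export kept the DMV column names shown above.

        # Hedged sketch: assumes you saved the DMV query's result grid from SSMS to
        # a CSV file with the column names shown above (TABLE_ID, COLUMN_ID,
        # RECORDS_COUNT, USED_SIZE). Computes bytes-per-row so the most promising
        # optimization candidates float to the top.
        import pandas as pd

        segments = pd.read_csv("column_segments.csv")   # hypothetical export path
        summary = (
            segments.groupby(["TABLE_ID", "COLUMN_ID"], as_index=False)
                    [["RECORDS_COUNT", "USED_SIZE"]].sum()
        )
        summary["BYTES_PER_ROW"] = summary["USED_SIZE"] / summary["RECORDS_COUNT"]
        print(summary.sort_values("USED_SIZE", ascending=False).head(10))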

    Read the article

  • Fast Data - Big Data's achilles heel

    - by thegreeneman
    At OOW 2013, in Mark Hurd and Thomas Kurian's keynote, they discussed Oracle's Fast Data software solution stack and a number of customers deploying Oracle's Big Data / Fast Data solutions, in particular Oracle's NoSQL Database. Since that time, there have been a large number of requests seeking clarification on how the Fast Data software stack works together to deliver on the promise of real-time Big Data solutions. Fast Data is a software solution stack that deals with one aspect of Big Data: high velocity. The software in the Fast Data solution stack involves 3 key pieces and their integration: Oracle Event Processing, Oracle Coherence, and Oracle NoSQL Database. All three of these technologies address a high-throughput, low-latency data management requirement.

    Oracle Event Processing enables continuous queries to filter the Big Data fire hose, enables intelligent chained events to drive real-time service invocation, and augments the data stream to provide Big Data enrichment. Extended SQL syntax allows the definition of sliding windows of time so that SQL statements can look for triggers on events like a breach of a weighted moving average on a real-time data stream.

    Oracle Coherence is a distributed, grid caching solution which is used to provide very low latency access to cached data when the data is too big to fit into a single process, so it is spread around in a grid architecture to provide memory-latency-speed access. It also has some special capabilities to deploy remote behavioral execution for "near data" processing.

    The Oracle NoSQL Database is designed to ingest simple key-value data at a controlled throughput rate while providing data redundancy in a cluster to facilitate highly concurrent, low-latency reads. For example, when large sensor networks are generating data that needs to be captured while analysts are simultaneously extracting the data using range-based queries for upstream analytics. Another example might be storing cookies from user web sessions for ultra-low-latency user profile management, also leveraging that data using holistic MapReduce operations with your Hadoop cluster to do segmented site analysis. Understand how NoSQL plays a critical role in Big Data capture and enrichment while simultaneously providing a low-latency and scalable data management infrastructure through clustered, always-on, parallel processing in a shared-nothing architecture. Learn how easily a NoSQL cluster can be deployed to provide essential services in industry-specific Fast Data solutions. See these technologies work together in a demonstration highlighting the salient features of these Fast Data enabling technologies in a location-based personalization service.

    The question then becomes how these things work together to deliver an end-to-end Fast Data solution. The answer is that while different applications will exhibit unique requirements that may drive the need for one or the other of these technologies, often when it comes to Big Data you may need to use them together. You may have the need for the memory latencies of the Coherence cache, but just have too much data to cache, so you use a combination of Coherence and Oracle NoSQL to handle extreme-speed cache overflow and retrieval. Here is a great reference to how these two technologies are integrated and work together: Coherence & Oracle NoSQL Database.

    On the stream processing side, it is similar to the Coherence case. As your sliding windows get larger, holding all the data in the stream can become difficult, and out-of-band data may need to be offloaded into persistent storage. OEP needs an extreme-speed database like Oracle NoSQL Database to help it continue to perform for the real-time loop while dealing with persistent spill in the data stream. Here is a great resource to learn more about how OEP and Oracle NoSQL Database are integrated and work together: OEP & Oracle NoSQL Database.
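
    To make the cache-overflow idea concrete, here is a deliberately generic sketch of the pattern: a bounded in-memory cache in front of a persistent key-value store, with write-through puts and fall-through gets. It is not the Coherence or Oracle NoSQL Database API, just the shape of the integration described above.

        # Generic illustration of the cache-overflow pattern: a small in-memory LRU
        # cache in front of a persistent key-value store, so hot keys are served at
        # memory speed and everything else falls through to the store. This is NOT
        # the Coherence or Oracle NoSQL API -- only the shape of the idea.
        from collections import OrderedDict

        class OverflowCache:
            def __init__(self, backing_store, capacity=1000):
                self.store = backing_store          # any dict-like persistent KV store
                self.capacity = capacity
                self.cache = OrderedDict()          # LRU order: oldest entries first

            def put(self, key, value):
                self.store[key] = value             # write-through to the KV store
                self.cache[key] = value
                self.cache.move_to_end(key)
                if len(self.cache) > self.capacity:
                    self.cache.popitem(last=False)  # evict the least-recently-used entry

            def get(self, key):
                if key in self.cache:
                    self.cache.move_to_end(key)
                    return self.cache[key]
                return self.store[key]              # cache miss: read from the store

        sessions = OverflowCache(backing_store={}, capacity=2)
        for i in range(5):
            sessions.put("cookie-%d" % i, {"user": i})
        print(sessions.get("cookie-0"))             # evicted from cache, served from the store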

    Read the article

  • Consumer Oriented Search In Oracle Endeca Information Discovery - Part 2

    - by Bob Zurek
    As discussed in my last blog posting on this topic, Information Discovery, a core capability of the Oracle Endeca Information Discovery solution, enables businesses to search, discover and navigate through a wide variety of big data including structured, unstructured and semi-structured data. With search as a core advanced capability of our product, it is important to understand some of the key differences and capabilities in the underlying data store of Oracle Endeca Information Discovery, and that is our Endeca Server. In the last post on this subject, we talked about exploratory search capabilities along with support for cascading relevance. Additional search capabilities in the Endeca Server, which differentiate it from simple keyword-based "search boxes" in other Information Discovery products, also include:

    The Endeca Server supports Set Search. The Endeca Server is organized around set retrieval, which means that it looks at groups of results (all the documents that match a search), as well as the relationship of each individual result to the set. Other approaches only compute the relevance of a document by comparing the document to the search query, not by comparing the document to all the others. For example, a search for "U.S." in another approach might match the title of a document and get a high ranking. But what if it were a collection of government documents in which "U.S." appeared in many titles, making that clue less meaningful? A set analysis would reveal this and be used to adjust relevance accordingly.

    The Endeca Server supports Second-Order Relevance. Unlike simple search interfaces in traditional BI tools, which provide limited relevance ranking, such as a list of results based on keyword matching, Endeca enables users to determine the most salient terms to divide up the results. Determining this second-order relevance is the key to providing effective guidance.

    Support for Queries and Filters. Search is the most common query type, but hardly the only one, and users need to express a wide range of queries. Oracle Endeca Information Discovery also includes navigation, interactive visualizations, analytics, range filters, geospatial filters, and other query types that are more commonly associated with BI tools. Unlike other approaches, these queries operate across structured, semi-structured and unstructured content stored in the Endeca Server. Furthermore, this set is easily extensible because the core engine allows for pluggable features to be added. Like a search engine, queries are answered with a results list, ranked to put the most likely matches first. Unlike "black box" relevance solutions, which generalize one strategy for everyone, we believe that optimal relevance strategies vary across domains. Therefore, it provides line-of-business owners with a set of relevance modules that let them tune the best results based on their content. The Endeca Server query result sets are summarized, which gives users guidance on how to refine and explore further. Summaries include Guided Navigation® (a form of faceted search), maps, charts, graphs, tag clouds, concept clusters, and clarification dialogs. Users don't explicitly ask for these summaries; Oracle Endeca Information Discovery analytic applications provide the right ones, based on configurable controls and rules. For example, the analytic application might guide a procurement agent filtering for in-stock parts by visualizing the results on a map and calculating their average fulfillment time.

    Furthermore, the user can interact with summaries and filters without resorting to writing complex SQL queries; the user can simply click to add filters. Within Oracle Endeca Information Discovery, all parts of the summaries are clickable and searchable. We are living in a search-driven society where business users really seem to enjoy entering information into a search box. We do this every day as consumers and therefore we have gotten used to looking for that box. However, the key to getting the right results is to guide that user in a way that provides additional discovery, beyond what they may have anticipated. This is why these important and advanced features of search inside the Endeca Server have been so important. They have helped to guide our great customers to success.

    Read the article

  • Web Hosting: Any web host that supports files more than 50,000 in number?

    - by Devner
    Hi all, for my PHP & MySQL based application, I am trying to buy website hosting from a host who does not have a limit on the number of files I carry in my hosting account. Almost all hosts have a common limit of 50,000 files (some call it 50,000 nodes); the rest (to the extent of my search) are not even close. I have gone through the various websites, Googled a lot of information, and have spoken with the customer service of the hosting companies, and they said that they have a limit of 50,000 files and that's why they call it the LIMIT. Now I have my application, which is a kind of social networking website, where people can upload various files of varying size. So say 50,000 users were to join the website and upload 1 file each: the limit of 50,000 will be reached very easily and my 50,001st customer will start facing file upload problems (and so will my account). So I would like to know if there are any website hosting services that do NOT levy such restrictions. In summary, I need the following options: no maximum file limit (more than 50,000 files in the account); no maximum file upload limit in the server settings (10MB, 12MB, 15MB, 20MB, etc.); ability to upload files of various types (zip, flv, jpg, png, etc.); ability to stream audio and video (live audio & video not necessary); access to .htaccess; access to php.ini, my.cnf or my.ini (this would be a plus); supports SSL; provides dedicated hosting (& IP) as well; monthly payments without contracts are a plus. If you know of any such website hosting services, please post a reply (a link to the same will be appreciated). Thank you.

    Read the article

  • Internet of Things Becoming Reality

    - by kristin.jellison
    The Internet of Things is not just on the radar—it’s becoming a reality. A globally connected continuum of devices and objects will unleash untold possibilities for businesses and the people they touch. But the “things” are only a small part of a much larger, integrated architecture. A great example of this comes from the healthcare industry. Imagine an expectant mother who needs to watch her blood pressure. She lives in a mountain village 100 miles away from medical attention. Luckily, she can use a small “wearable” device to monitor her status and wirelessly transmit the information to a healthcare hub in her village. Now, say the healthcare hub identifies that the expectant mother’s blood pressure is dangerously high. It sends a real-time alert to the patient’s wearable device, advising her to contact her doctor. It also pushes an alert with the patient’s historical data to the doctor’s tablet PC. He inserts a smart security card into the tablet to verify his identity. This ensures that only the right people have access to the patient’s data. Then, comparing the new data with the patient’s medical history, the doctor decides she needs urgent medical attention. GPS tracking devices on ambulances in the field identify and dispatch the closest one available. An alert also goes to the closest hospital with the necessary facilities. It sends real-time information on her condition directly from the ambulance. So when she arrives, they already have a treatment plan in place to ensure she gets the right care. The Internet of Things makes a huge difference for the patient. She receives personalized and responsive healthcare. But this technology also helps the businesses involved. The healthcare provider achieves a competitive advantage in its services. The hospital benefits from cost savings through more accurate treatment and better application of services. All of this, in turn, translates into savings on insurance claims. This is an ideal scenario for the Internet of Things—when all the devices integrate easily and when the relevant organizations have all the right systems in place. But in reality, that can be difficult to achieve. Core design principles are required to make the whole system work. Open standards allow these systems to talk to each other. Integrated security protects personal, financial, commercial and regulatory information. A reliable and highly available systems infrastructure is necessary to keep these systems running 24/7. If this system were just made up of separate components, it would be prohibitively complex and expensive for almost any organization. The solution is integration, and Oracle is leading the way. We’re developing converged solutions, not just from device to datacenter, but across devices, utilizing the Java platform, and through data acquisition and management, integration, analytics, security and decision-making. The Internet of Things (IoT) requires the predictable action and interaction of a potentially endless number of components. It’s in that convergence that the true value of the Internet of Things emerges. Partners who take the comprehensive view and choose to engage with the Internet of Things as a fully integrated platform stand to gain the most from the Internet of Things’ many opportunities. To discover what else Oracle is doing to connect the world, read about Oracle’s Internet of Things Platform. Learn how you can get involved as a partner by checking out the Oracle Java Knowledge Zone. Best regards, David Hicks

    Read the article

  • How to monitor bandwidth use of each device on wifi network

    - by GWLlosa
    I have in my home a standard Comcast cable internet connection. I have it going from the wall to a cable modem, and from the modem to a late-series Linksys router, which provides wired and wireless networking. The vast majority of the users are wireless connections. For day-to-day tasks, this connection is fully sufficient for all my needs. However, on regular occasions, we have social gatherings that involve many people bringing laptops and other PCs and using the network and internet simultaneously, frequently for gaming. I have no administrative oversight over these machines; they have been known to be riddled with spyware and/or bloatware or be running torrents, legal or otherwise. The only reason I care is that on a regular basis, one of the machines will flatline my internet bandwidth, and consume it all in order to upload/download/spam people/whatever. When this happens, the latency of the connections for gaming and the like becomes unacceptable, and everyone suffers. My question is: Is there a system I can set up whereby I can easily monitor the various systems connected to my wireless connection, see how much bandwidth each one is using, and for what ends? That way, at a glance, I can spot the offending machine and kick it from the connection, without having to go from machine to machine, checking each one's "bandwidth used" properties manually, and dealing with the owner's indignant protests all the while. I understand this will likely involve 3rd-party software and/or hardware; my issue is I don't even know where to begin.

    Read the article

  • How To Monitor Home Wireless Network Connected Devices Bandwidth

    - by GWLlosa
    (Originally posted on SuperUser, not sure if it might be better suited here) I have in my home a standard Comcast cable internet connection. I have it going from the wall to a cable modem, and from the modem to a late-series Linksys router, which provides wired and wireless networking. The vast majority of the users are wireless connections. For day-to-day tasks, this connection is fully sufficient for all my needs. However, on regular occasions, we have social gatherings that involve many people bringing laptops and other PCs and using the network and internet simultaneously, frequently for gaming. I have no administrative oversight over these machines; they have been known to be riddled with spyware and/or bloatware or be running torrents, legal or otherwise. The only reason I care is that on a regular basis, one of the machines will flatline my internet bandwidth, and consume it all in order to upload/download/spam people/whatever. When this happens, the latency of the connections for gaming and the like becomes unacceptable, and everyone suffers. My question is: Is there a system I can set up whereby I can easily monitor the various systems connected to my wireless connection, see how much bandwidth each one is using, and for what ends? That way, at a glance, I can spot the offending machine and kick it from the connection, without having to go from machine to machine, checking each one's "bandwidth used" properties manually, and dealing with the owner's indignant protests all the while. I understand this will likely involve 3rd-party software and/or hardware; my issue is I don't even know where to begin.

    Read the article

  • Windows: disable remote access of local drive, even by domain admin

    - by Matt
    We have a network of Windows 7 PCs that are managed as part of a domain. What we want is for the domain admin to be unable to view the PC's local drive (C:) unless he is physically at the PC. In other words, no remote desktop and no ability to use UNC. In other words, the domain admin should not be allowed to put \\user_pc\c$ in Windows Explorer and see all the files on that computer, unless he is physically present at the PC itself. Edit: to clarify some of the questions/comments that have come up. Yes, I am an admin---but a complete Windows novice. And yes, for the sake of this and my similar questions, it is fair to assume that I am working for someone who is paranoid. I understand the arguments about this being a "social problem versus a technical problem", and "you should be able to trust your admins", etc. But this is the situation in which I find myself. I'm basically new to Windows system administration, but am tasked with creating an environment that is secure by the company owner's definition---and this definition is clearly very different from what most people expect. In short, I understand that this is an unusual request. But I'm hoping there is enough expertise in the ServerFault community to point me in the right direction.

    Read the article

  • Are You a WebCenter Innovator?

    - by Michael Snow
    Calling all Oracle WebCenter Innovators: submit your nomination for the 2012 Innovation Awards. Click here to submit your nomination today. Call for Nominations: Oracle Fusion Middleware Innovation Awards 2012. Are you doing something unique and innovative with Oracle Fusion Middleware? Submit a nomination today for the Oracle Fusion Middleware Innovation Awards. Winners receive a free pass to Oracle OpenWorld 2012 in San Francisco (September 30 - October 4th) and will be honored during a special event at OpenWorld. Categories include: Oracle Exalogic; Cloud Application Foundation; Service Integration (SOA) and BPM; WebCenter; Identity Management; Data Integration; Application Development Framework and Fusion Development; Business Analytics (BI, EPM and Exalytics). To be considered for this award, complete the Oracle Fusion Middleware Innovation Awards nomination form and send it to [email protected]. The deadline to submit a nomination is 5pm Pacific on July 17, 2012.

    Read the article

  • CFOs: Do You Have a Playbook for Growth?

    - by Oracle Accelerate for Midsize Companies
    by Jim Lein, Oracle Midsize Programs. In most global markets, CFOs are optimistic about their company's growth opportunities. Deloitte's CFO Signals report, "Time to Accelerate", found that: in the U.K., business optimism is at its highest level in three-and-a-half years; optimism in North America rose from a strong +42% last quarter (Q2 to Q3 2013) to an even stronger +54%; and in the inaugural Southeast Asia survey, 44% of CFOs reported a positive outlook despite worries over the Chinese economy and political uncertainty. Sustainable and profitable business growth doesn't usually happen by accident. Companies need a playbook for growth that's owned by the CFO. And today, that playbook must leverage the six enabling technologies: Social, Big Data, Mobile, Cloud, Analytics, and the Internet of Things (or, as Oracle president Mark Hurd explains, "the Internet of the People"). On Monday, June 9 at 2:00 pm Eastern, CFO.com is hosting a webcast, "The CFO Playbook on Growth: How CFOs Can Boost Efficiency and Performance with Automation". "Investing in technology begins with a business-metric-driven business case with clear, tangible business results expected," says John Lieblang, Affiliate Partner with Waterstone Management Group. "The progressive CFO has learned how to forge a partnership with the CIO to align everyone in the 'result value chain' to be accountable for the business results, not just for functional technology." Click HERE to register. Looking for more news and information about Oracle Solutions for Midsize Companies? Read the latest Oracle for Midsize Companies newsletter and sign up to receive the latest communications from Oracle's industry leaders and experts. Jim Lein: I evangelize Oracle's enterprise solutions for growing midsize companies. I recently celebrated 15 years with Oracle, having joined JD Edwards in 1999. I'm based in Evergreen, Colorado and love relating stories about creativity and innovation, whether they be about software, live music, or the mountains. The views expressed here are my own, and not necessarily those of Oracle.

    Read the article

  • Some Apps don't start on Windows 8 Release Preview

    - by Exa
    I recently installed the Release Preview of Windows 8 in a virtual machine. Some apps do not work. When I open them (by clicking on their tile in the start screen) I see a splash screen and nothing else happens. Sometimes the app crashes after 30 seconds, sometimes it just keeps on loading. A good example is the "Map"-App from Windows 8 or the app "Cookbook" by Bewise. I installed Cookbook and when I had a look at the task manager I saw that it was the 32bit version running, but I have an x64 Windows 8... Could this be a problem? Shouldn't the Windows Store download the correct version? This is the setup of my virtual machine: Windows 8 Release Preview x64 Oracle VirtualBox 4 of 8 cores from host system 8 of 16 GB RAM from the host system 256 MB graphics memory guest additions installed resolution 1920 x 1080 Do you need further information? Unfortunately there is no error message... I just see the start screen of the app with its logo and it keeps loading, but nothing happens. Other Apps (like Mail, Video, Social, etc.) work fine.

    Read the article

  • 'The RPC server is unavailable' or 'access is denied' error when using Remote Desktop Services Manager on Windows 7 (but mstsc.exe works!)

    - by tbone
    I am trying to connect to a Windows XP workstation from a Windows 7 Ultimate workstation using Remote Desktop Services Manager. I am able to do a Remote Desktop (mstsc.exe) session from the Win7 machine to the WinXP machine with no problem at all. When running the Remote Desktops Admin (tsmmc.msc) tool on a Windows XP box, I can also connect with no problem. However, when I use the new Remote Desktop Services Manager on Windows 7 and try to connect, I get the error: "The RPC server is unavailable". What could cause this? Has there been some fundamental change in Remote Desktop Services Manager; does it connect in a different way somehow? Update #1: I turned off the firewall on the Windows XP box and the "The RPC server is unavailable" error went away, so RDSM seems to be using an entirely new port/connection/service compared to mstsc.exe or the old Remote Desktops Admin tool. Now, after disabling the firewall, I get a new error: "Access is denied". After doing some googling, I found some articles discussing this; basically, the error is very misleading. The actual problem is that if either side of the connection has dual monitors, and they are not both Win7 Ultimate, then you cannot connect using Remote Desktop Services Manager. The reason is that by default it uses the /multimon switch, and this switch requires a certain level of Windows license; there seems to be no way of changing this default (if anyone knows of a way to change this default, please post an answer or comment!). Nice going, Microsoft. http://social.technet.microsoft.com/Forums/en-US/windowsserver2008r2rds/thread/4d06278f-e0f4-4f8e-a8e1-3697ee967ef4 http://www.experts-exchange.com/OS/Microsoft_Operating_Systems/Windows/Windows_7/Q_26225743.html

    Read the article

  • OBI10g will be Non-Qualifying for OPN Specialisation in September 2013

    - by Mike.Hallett(at)Oracle-BI&EPM
    Just to alert you that OPN Specialisation is software-version-specific: as we bring out new product versions, the specialisation criteria and exams need to be kept up to date with the new version. Specifically for our analytics partners, be aware that your "OBI10g Specialisation" credentials will be non-qualifying for OPN Specialisation in September 2013. Therefore, to keep your profile up to date, please start now to take the latest OBI11g exams and apply to OPN to upgrade to "OBI11g Specialisation". Find out more about the new-version OBI11g exams to update your OPN Specialisation at "OPN Exam for OBI Suite 11g is Now LIVE", and more generally the "Update on BI & EPM Specializations and Knowledge Zones". And to help you pass, see the Oracle Business Intelligence Foundation Suite 11 Essentials Exam Study Guide. Also check the documents on OPN for Frequently Asked Questions for Product Version Specialization and the latest Specializations Catalog.

    Read the article

  • College Ratings via the Federal Government

    - by user9147039
    A few weeks back you might remember news about a higher education rating system proposal from the Obama administration. As I've discussed previously, political and stakeholder pressures to improve outcomes and increase transparency are stronger than ever before. The executive branch proposal is intended to make progress in this area. Quoting from the proposal itself, "The ratings will be based upon such measures as: Access, such as percentage of students receiving Pell grants; Affordability, such as average tuition, scholarships, and loan debt; and Outcomes, such as graduation and transfer rates, graduate earnings, and advanced degrees of college graduates."

    This is going to be quite complex, to say the least. Most notably, higher ed is not monolithic. From community and other 2-year colleges, to small private 4-year schools, to professional schools, to large public research institutions, the many walks of higher ed life are, well, many. Designing a ratings system that doesn't wind up with lots of unintended consequences and collateral damage will be difficult. At best you would end up potentially tarnishing the reputation of certain institutions that were actually performing well against the metrics and outcome measures that make sense in their "context" of education. At worst you could spend a lot of time and resources designing a system that would lose credibility with its "customers".

    A lot of institutions I work with already have in place systems like the one described above. They are tracking completion rates, completion timeframes, transfers to other institutions, job placement, and salary information. As I talk to these institutions there are several constants worth noting:

    • Deciding on which metrics to measure is complicated. While employment and salary data are relatively easy to track, qualitative measures are more difficult. How do you quantify the benefit to someone who studies in a field that may not compensate him or her as well as another, but that provides huge personal fulfillment and reward?

    • The data is available, but the systems to transform the data into actual information that can be used in meaningful ways are not. Too often in higher ed, information is siloed. As such, much of the data that needs to be part of a comprehensive system sits in multiple organizations, oftentimes outside the reach of core IT.

    • Politics and culture are big barriers. One of the areas that my team and I spend a lot of time talking about with higher ed institutions all over the world is the imperative to optimize for student success. This, like the tracking of students' achievement after graduation, requires a level of organizational capacity that does not currently exist. The primary barrier is the culture of "data islands" in higher ed, and the need for leadership to drive out the divisions between departments, schools, colleges, etc. and institute academy-wide analytics and data stewardship initiatives that will enable student success.

    • Data quality is a very big issue. So many disparate systems exist (some on premise, some "in the cloud") that keep data about "persons" using different means to identify them. Establishing a single source of truth about an individual and his or her data is difficult without some type of data quality policy and tools. Good tools actually exist but are seldom leveraged.

    Don't misunderstand: I think it's a great idea to drive additional transparency and accountability into the system of higher education.
And not just at home, but globally. Students and parents need access to key data to make informed, responsible choices. The tools exist to not only enable this kind of information to be shared but to capture the very metrics stakeholders care most about and in a way that makes sense in the context of a given institution's "place" in the overall higher ed panoply.

    Read the article

  • What to do with old laptop screens?

    - by Lord Torgamus
    This question is inspired by another SU question I came across earlier today: What to do with old hard drives? It made me think about two long-dead laptops I have with perfectly good screens still inside. One is a Dell Inspiron 5100 and the other is an Averatec E1200, but responses need not be geared towards those particular models' screens. Rules, based heavily on the original question's: Objectives and suggestions to keep in mind when you post an answer : Should showcase your geekiness, be plain ol' fun, serve a social purpose or benefit the community. Your answer need not be limited to only one screen. For a really good answer, I'll go out and buy additional leftover screens. Your answer need not be limited to one project per screen. If additional accessories need be purchased, make sure they are common. Don't tell me to get a moon rock or something. The projects you suggested should serve a useful purpose; art is nice, but functional art is way better. Thanks in advance, folks. EDIT: Found another related question. Fun projects to do with an old 17" LCD monitor EDIT 2: I, for one, am enjoying the new outpouring of creativity here. Best fifty bucks... I mean, rep points... I ever spent. EDIT 3: That does it. At the end of the week, there was a tie for most votes between the accepted answer and the game platform answer. The game platform answer was cooler, but less reasonable as a project to actually do; in other words, it was more moon rocky. Unfortunately, I think fencepost had the best comment on the topic, which is that displays on their own have no good interface. Thanks for playing, everyone!

    Read the article

  • What is the correct approach I should use for an application that requires Amazon S3 uploads and SimpleDB data management?

    - by Luis Oscar
    I am developing an application for iOS and that is going smoothly; the problem is that I am very new at server-side things. I am totally confused about how to correctly use Amazon Web Services for this purpose. What I want to do is very simple. I want my application to be able to query a servlet hosted in EC2 in order to retrieve pictures and data, based on some criteria, from S3 and SimpleDB respectively. Also, the application should be able to upload pictures into an S3 bucket and register the information in SimpleDB. My main concerns are security and costs. So far I was using the Amazon Token Vending Machine, but I haven't been successful when trying to customize it, and while researching I discovered that in the long run it is very expensive. The ultimate goal is to handle a "social" picture service for my iOS application: being able to register new users, authenticate these users, and see what permissions they have to which pictures from the bucket. And all this without having to worry about third parties accessing the private pictures of my users. Sorry for this question, but I am really clueless about how to handle this... I have tried reading many articles, but all this server stuff looks very scary.
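
    One common pattern for the security concern raised here, as an alternative to customizing the Token Vending Machine, is to have the EC2 servlet hand the iOS app short-lived pre-signed S3 URLs so the app never holds AWS credentials. The sketch below shows the server-side call with boto3; the bucket name and key scheme are invented, and a Java servlet would use the equivalent AWS SDK for Java call rather than this Python code.

        # Sketch of the server-side piece: instead of shipping AWS credentials to the
        # phone, the server hands the app a short-lived, pre-signed S3 URL for each
        # upload. Bucket name and key scheme are hypothetical; credentials must be
        # configured on the server (e.g. an EC2 instance role) for signing to work.
        import boto3

        s3 = boto3.client("s3")

        def upload_url(user_id, photo_id, expires=300):
            """Return a URL the app can HTTP PUT the photo to, valid for 5 minutes."""
            key = "photos/%s/%s.jpg" % (user_id, photo_id)   # hypothetical key scheme
            return s3.generate_presigned_url(
                ClientMethod="put_object",
                Params={"Bucket": "my-social-photos", "Key": key},  # hypothetical bucket
                ExpiresIn=expires,
            )

        print(upload_url("user123", "beach-42"))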

    Read the article

  • SEO/Google: How should I handle multiple countries and domains?

    - by Valorized
    Hello. I'm the webmaster of an online shop based in Austria (Europe), therefore we registered "example.at". We also own other domain names like "example-shop.com" and "example.info". Currently all those domains are redirected (301) to the .at one. Still available: "example.net" and "example.org" (and .ws/.cc); unfortunately not available: .de/.eu. The .com is currently owned by one of our partners; the contract ends in 2012, but until then we have no chance to get it. Recently I read more about geo-targeting and I noticed ONE big deal: the TLD ".at" is hardly recognised in Germany (google.de), whereas it is excellently listed in Austria (google.at). As a result of the .at, I cannot set the target location manually (or to unlisted). More info: https://www.google.com/support/webmasters/bin/answer.py?answer=62399&hl=en This is a big problem. I looked at Google Analytics and, although Germany is 10x as big as Austria, there are more visits from Austria. So, how should I configure the domains in order to get the best results in both Germany and Austria? I thought of some solutions:

    1. I could stop redirecting the .info. Then there would be a duplicate of the .at one. Moreover, in Webmaster Tools, I could set the target location of the .info to Germany. As the .at still targets Austria, both would be targeted; however, I don't know if Google punishes one of them because of the duplicate content.

    2. Same as 1., but with .net or .org (I think .info is not a "nice" domain and moreover I think search engines prefer .com, .net or .org to .info).

    3. Same as 1. (or 2.), but with a rel="canonical" on the new one (pointing to the .at). Con: I don't think this will improve the situation, because it still tells Google that the .at one is more important, like: "if .info points to .at, the target may still be Austria".

    4. rel="canonical" on the .at pointing to the new one (.info, .net or .org). However, I fear that this will have a negative impact on the listing on google.at, because: "Hey, the well-known .at is not important anymore, so let's focus on the .info, which is not well-known." Therefore: bad position in search results.

    5. Redirect .at to the new one (.info, .net or .org) with a 301 redirect. Con: might be worse than 4.; we might lose PageRank (or "the value of the page", because Google says that PageRank is not important anymore). Moreover, this might be even more confusing for the customers. In 3. or 4. customers don't get redirected; they do not see the canonical meta tag.

    So, dear experts, please tell me what the best option would be! Thank you very much for your advice in advance and please excuse the long question. I really appreciate this network! Please note: it's exactly the same content AND language. In Austria we speak German.

    Read the article

  • Building a small server farm

    - by RayQuang
    Hi, I am planning to set up a tech startup company that will provide web application solutions. Eventually we hope to diversify into different areas such as social media or other services. For now we plan on running a high-demand website (from 1,000 to 10,000 users in the first year) running the application. This includes a MySQL database backend, email, and development servers. My question, then, is what type of server arrangement will work best; that is to say, should I have a small cluster of ultra-high-power machines (e.g. top-of-the-range Xeons with 12 GB RAM), or will it be better to have more, less powerful servers, load balanced? Should I go for 1U-2U rack-mounted servers, or would it be better to use tower servers for maintainability? Finally, I would also like to know what kind of internet connection and router I would need. I currently have 10 Mbit down and barely 1 Mbit up, but soon our area will have a fiber optic connection with international speeds of up to 25 Mbit/sec. Thanks in advance, RayQuang. UPDATE: sorry, I forgot to mention it: the platform that I will be using is PHP with the APC code cache, probably running Debian.

    Read the article
