Search Results

Search found 24879 results on 996 pages for 'prime number'.

Page 574/996

  • Empty AAAA DNS Record with long TTL?

    - by Joel K
    I pay for a DNS service based on queries per second. We are not using IPv6, but a large number of queries (that I pay for) are coming in for AAAA records. I understand that most DNS stacks will now ask for A and AAAA records at the same time, and that I can't change that. What I /would/ like to do is put something in the AAAA records with a long TTL, to decrease my hit rate. Is there anything I can put there? Null? The equivalent IPv4 address? Any guidance would be appreciated.

    Read the article

  • Unable to delete files in Temporary Internet Files folder

    - by Johnny
    I'm on Win7. I have a large number of large .bin files, totaling 183GB, in my Temporary Internet Files folder. They all seem to come from video sharing sites like YouTube. The files are invisible in Explorer even after enabling viewing of hidden files. The only way I can see them is by issuing "dir /fs" on the command line. Now when I try to delete them from the command line nothing happens. Trying to delete the whole folder from Explorer results in access denied because another process is using a file in the folder (IE is not running while I'm doing this). Trying to clear the folder using IE is also unsuccessful. How do I delete these files? How did they end up there without being deleted by IE?
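    A minimal command-line sketch of one way to force the deletion, assuming an elevated Command Prompt, Internet Explorer fully closed, and the default Win7 cache location (the path is an assumption, and the takeown/icacls steps are only needed if the files are owned by another account):

        cd /d "%LOCALAPPDATA%\Microsoft\Windows\Temporary Internet Files"
        takeown /f . /r /d y
        icacls . /grant %USERNAME%:F /t
        del /f /s /q /a *.bin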

    Read the article

  • Uploading many large files to a remote server

    - by TiernanO
    I am in the process of creating an offsite backup, and need to do an initial load of data. Currently, that's about 400GB, give or take 10GB or so. The backup system is producing files which are about 4GB each, plus some other, smaller related files. So, I need to transfer all 400-ish gigs to a remote server, but how? What is the best method? I have full remote access to the server, so I can install anything I need to install. There are Windows, Linux and Solaris VMs running on the box itself, so any of those can be used there, and I have Windows and Linux at home. I have 2 internet connections in the house, 10Mb/s upload on each, so something that could potentially split the transfer across connections would be handy (kind of like GetRight, but in reverse... PutRight?).
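    A hedged sketch of one common approach, assuming SSH access to the Linux or Solaris side of the remote box and rsync installed at both ends (paths and host name are illustrative): rsync can resume interrupted transfers, which matters for a 400GB initial load.

        # --partial keeps half-transferred files so a dropped connection can resume
        rsync -av --partial --progress /backups/ user@remote.example.com:/backups/

    Running two rsync instances, each bound to a different internet connection and each given half of the 4GB files, is one simple way to use both 10Mb/s uplinks at once.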

    Read the article

  • check two conditions in two different columns in excel and count the matches

    - by user1727103
    I'm trying to create an error log to help me analyse my mistakes. For simplicity, let's assume I have two columns: "Question Type" with values SC, RC, CR, and another column that indicates whether I got the question "Right/Wrong". Let's assume this is my table:

        Question No. | Right/Wrong | Question Type
                     | Right       | SC
                     | Right       | RC
                     | Wrong       | SC
                     | Wrong       | CR
                     | Right       | RC

    And I want an output table like this:

        Type of Question | Right | Wrong | Total
        SC               | 1     | 1     | 2
        RC               | 2     | 0     | 2
        CR               | 0     | 1     | 1

    So basically what I want to do is check column C for SC using =COUNTIF(C1:C5,"SC"), which returns the total number of SC questions, and then out of the SC questions I need to find out which are Right. If I know the Right count and the total, I can get the Wrong count. I have never written a macro, so a formula-based answer would suffice.
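    A hedged formula sketch, assuming the Right/Wrong values sit in B1:B5 and the question types in C1:C5 (the cell ranges are assumptions for illustration); COUNTIFS (Excel 2007 and later) applies both conditions at once:

        =COUNTIFS(C1:C5, "SC", B1:B5, "Right")                                   SC questions answered Right
        =COUNTIF(C1:C5, "SC") - COUNTIFS(C1:C5, "SC", B1:B5, "Right")            SC questions answered Wrong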

    Read the article

  • General purpose ticketing/tech support system [closed]

    - by crazybyte
    Possible Duplicate: What’s your favorite ticketing system? I was wondering if somebody could recommend a very user-friendly or simple general-purpose ticketing/tech support system. I need something that is web based, preferably open-source/free software implemented in PHP, Ruby, Ruby on Rails or Java (as back end) with MySQL or PostgreSQL as the database engine. I need something that is not development-management or project-management oriented like Eventum or similar (random example); something to which a user can connect, open a tech support request and follow it until it is solved or dropped. I need it to be open source so I can modify or extend it if the need arises. I tried a number of such systems and found that osTicket or eTicket is close to what I need, but the code is somewhat flaky and some of the features work badly or behave strangely. Any thoughts/advice on where to find something similar? Thanks!

    Read the article

  • How can I combine 30,000 images into a timelapse movie?

    - by Swift
    I have taken 30,000 still images that I want to combine into a timelapse movie. I have tried QuickTime Pro, TimeLapse 3, and Windows Movie Maker, but with such a huge number of images each of the programs fails (I tried SUPER ©, but couldn't get it to work either). It seems that all of these programs crash after a few thousand pictures. The images are all in .JPG format, at a resolution of 1280x800, and I'm looking for a program that can put them into a timelapse movie in some kind of lossless format (raw/uncompressed AVI would be fine) for further editing. Does anyone have any ideas, or has anyone tried anything like this with a similar number of pictures?
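    A hedged command-line sketch, assuming ffmpeg is installed and the frames are numbered sequentially (frame00001.jpg ... frame30000.jpg); the file name pattern, frame rate and the choice of the lossless HuffYUV codec are illustrative. ffmpeg streams the images rather than loading all 30,000 at once, which avoids the crashes described above:

        ffmpeg -framerate 24 -i frame%05d.jpg -c:v huffyuv timelapse.avi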

    Read the article

  • How much CPU use is too much?

    - by Jonathan Sampson
    I've got a server that receives around a million unique visitors a month, and I've recently begun using Plesk to help monitor some of the vitals on the box itself. RAM I can make sense of, but I'm not really sure if my CPU usage is too high, too low, or about average for this number of visitors. The server exists solely to serve up a somewhat hefty WordPress blog. This is one week. What types of things should I look out for? Some other information about this server: vCPUs: 4, RAM: 6GB, HDD: 30GB, OS: Ubuntu Server 10.04 x86_64

    Read the article

  • Using rspec to check creation of template

    - by Brian
    I am trying to use rspec with puppet to check the generation of a configuration file from an .erb file. However, I get the error:

        1) customizations should generate valid logstash.conf
           Failure/Error: content = catalogue.resource('file', 'logstash.conf').send(:parameters)[:content]
           ArgumentError: wrong number of arguments (0 for 1)
           # ./spec/classes/logstash_spec.rb:29:in `catalogue'
           # ./spec/classes/logstash_spec.rb:29

    And the logstash_spec.rb:

        describe "customizations" do
          let(:params) { {:template => "profiles/logstash/output_broker.erb", :options => {'opt_a' => 'value_a' } } }

          it 'should generate valid logstash.conf' do
            content = catalogue.resource('file', 'logstash.conf').send(:parameters)[:content]
            content.should match('logstash')
          end
        end
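    A hedged alternative sketch, assuming rspec-puppet is the framework in use (the params/catalogue helpers suggest it): its contain_file matcher can assert on the rendered content directly, which sidesteps calling catalogue by hand; the file title and regex are taken from the question:

        it 'should generate valid logstash.conf' do
          should contain_file('logstash.conf').with_content(/logstash/)
        end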

    Read the article

  • When decomposing a large function, how can I avoid the complexity from the extra subfunctions?

    - by missingno
    Say I have a large function like the following:

        function do_lots_of_stuff(){
            { //subpart 1
              ...
            }
            ...
            { //subpart N
              ...
            }
        }

    A common pattern is to decompose it into subfunctions:

        function do_lots_of_stuff(){
            subpart_1(...)
            subpart_2(...)
            ...
            subpart_N(...)
        }

    I usually find that decomposition has two main advantages:
    - The decomposed function becomes much smaller. This can help people read it without getting lost in the details.
    - Parameters have to be explicitly passed to the underlying subfunctions, instead of being implicitly available by just being in scope. This can help readability and modularity in some situations.

    However, I also find that decomposition has some disadvantages:
    - There are no guarantees that the subfunctions "belong" to do_lots_of_stuff, so there is nothing stopping someone from accidentally calling them from the wrong place.
    - A module's complexity grows quadratically with the number of functions we add to it (there are more possible ways for things to call each other).

    Therefore: are there useful conventions or coding styles that help me balance the pros and cons of function decomposition, or should I just use an editor with code folding and call it a day?

    EDIT: This problem also applies to functional code (although in a less pressing manner). For example, in a functional setting the subparts would return values that are combined at the end, and the decomposition problem of having lots of subfunctions able to use each other is still present. We can't always assume that the problem domain can be modeled with just a few small, simple types and a handful of highly orthogonal functions. There will always be complicated algorithms or long lists of business rules that we still want to be able to deal with correctly.

        function do_lots_of_stuff(){
            p1 = subpart_1()
            p2 = subpart_2()
            pN = subpart_N()
            return assembleStuff(p1, p2, ..., pN)
        }
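    One common mitigation, sketched below in the same JavaScript-like notation as the question (the names are illustrative, not a definitive answer): declare the subparts as local functions inside do_lots_of_stuff, so they cannot be called from elsewhere in the module and the growth in possible call paths stays confined to one function body.

        function do_lots_of_stuff(input) {
            // Local helpers: visible only inside do_lots_of_stuff.
            function subpart_1(x) { /* ... */ return x; }
            function subpart_2(x) { /* ... */ return x; }

            var p1 = subpart_1(input);
            var p2 = subpart_2(p1);
            return assembleStuff(p1, p2);
        }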

    Read the article

  • New Versions of Whitepapers are available

    - by Anthony Shorten
    The set of whitepapers that are available is progressively being updated and republished to reflect new versions of the products as well as new advice for existing customers. A number of whitepapers have now been updated (the My Oracle Support Doc Id is indicated):
    - What’s New in Oracle Utilities Application Framework V4 (Doc Id: 1177265.1) - Updated for the latest facilities in Oracle Utilities Application Framework V4.1.
    - Batch Best Practices (Doc Id: 836362.1) - Updated with newer advice, including more details of how CLUSTERED mode works, how to migrate to CLUSTERED mode, and some configuration examples covering typical scenarios.
    - Oracle Utilities Application Framework Architecture Guidelines (Doc Id: 807068.1) - Updated to reflect additional architecture advice.
    - Performance Troubleshooting Guides (Doc Id: 560382.1) - Updated for the latest facilities in Oracle Utilities Application Framework V4.1, and includes additional techniques that customers have used to track performance.
    The whitepapers apply to all Oracle Utilities Application Framework products, which at the present time includes:
    - Oracle Utilities Customer Care And Billing (V2.x)
    - Oracle Enterprise Taxation Management (V2.x)
    - Oracle Utilities Business Intelligence (V2.x)
    - Oracle Utilities Meter Data Management (V2.x)
    - Oracle Utilities Mobile Workforce Management (V2.x)
    - Oracle Utilities Smart Grid Gateway (V2.x)
    Additional whitepapers and updates will be posted as they become available.

    Read the article

  • A definition for a CPU second?

    - by dude
    Hey, I'm totally behind on this topic. Yesterday I was doing profiling for a script I'm working on, and the unit for time spent was a 'CPU second'. Can anyone remind me of the definition? For example, for one profiling run I got: 200.750 CPU seconds. What is that supposed to mean? In another case, for a time-consuming process, I got: -347.977 CPU seconds, a negative number! Is there any way I can convert that time to calendar time? Cheers,
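    For orientation: a CPU second is one second of processor time actually consumed by the process (user time plus kernel time), as opposed to wall-clock (calendar) time, so the two only line up when the process keeps one core fully busy. A sketch with illustrative numbers, in the format of the Unix time command:

        $ time ./myscript
        real    4m10.20s     # wall-clock (calendar) time
        user    3m08.35s     # CPU seconds spent running the script's own code
        sys     0m12.40s     # CPU seconds spent in the kernel on its behalf

    A negative figure such as -347.977 is most likely a profiler artifact (for example, a timer counter wrapping around) rather than a real measurement; that is an assumption about the tool in question, not something the output itself confirms.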

    Read the article

  • Massive 404 attack with non existent URLs. How to prevent this?

    - by tattvamasi
    The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, for pages and queries that have never been there. One of them is viewtopic.php, and I've also noticed a scary number of attempts to check if the site is a WordPress site (wp_admin) and for the cPanel login. I block TRACE already, and the server is equipped with some defense against scanning/hacking. However, this doesn't seem to stop. The referrer is, according to Google Webmaster, totally.me. I have looked for a solution to stop this, because it certainly isn't good for the poor real actual users, let alone the SEO concerns. I am using the Perishable Press mini black list (found here), a standard referrer blocker (for porn, herbal, casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc). The server is using other measures as well, so one would assume that the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e., some sort of attack? Is there a way to fix it, or better, prevent this useless resource waste? EDIT: I've never used the question to thank for the answers, and I hope this can be done. Thank you all for your insightful replies, which helped me find my way out of this. I have followed everyone's suggestions and implemented the following: a honeypot; a script that listens for suspect URLs on the 404 page and sends me an email with the user agent/IP, while returning a standard 404 header; and a script that rewards legitimate users, in the same custom 404 page, in case they end up clicking on one of those URLs. In less than 24 hours I have been able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. Thank you all again; I would have accepted all answers if I could.

    Read the article

  • How can I password-protect a Mac shared folder on a Windows workgroup?

    - by Phillip Oldham
    We have a Mac mini running 10.5.8 which already acts as a fileserver for our simple Windows (mixed XP/Vista) workgroup. The Mac mini is on the same workgroup and the files are shared via SMB, FTP, and AFP. Basic file sharing is working, and has been for some time. We'd now like to add an additional directory/share which can be secured by a password so that only a small number of people on the network have access. Is this possible? I've already tried creating the additional folder on the Mac, adding it to the shared folders, and limiting it to a specific "shared user", however it's not possible to log in from an XP machine. Adding a sub-directory to the currently working share and limiting its access to the shared user doesn't work either.

    Read the article

  • ISP Couldn't Verify My HFC MAC Id

    - by Ann Rahn
    An Internet Service Provider whom I used in the past claims to still be providing me with Internet service. However, when I gave his technician the Hybrid Fiber-Coax (HFC) MAC ID on my modem, he could not find it on his list. Also, the Internet Service Provider doesn't even have my current mailing address; all he has is my phone number, and he is demanding payment. On the other hand, I use a cable service which provides TV. At this point, the Internet Service Provider is threatening to disconnect my Internet service. My question: how can I verify whether I am getting the service from him or through the cable company?

    Read the article

  • Unique string values in range

    - by Dean Smith
    I have some spreadsheets where a large number of cells have essentially been used for free text. There is a finite set of values for this free text and most, if not all, repeat. e.g.

             A        B        C        D
        1    Monkey   Gorilla  Cat      Dog
        2    Dog      Cat      Gorilla  Gorilla
        3    Dog      Dog      Dog      Cat

    There are probably 50 or so different cell values spread over multiple sheets and hundreds of rows and columns. I need to analyse this data and count occurrences, which is not a problem other than getting a list of unique values to start with, and this has been driving me up the wall. What is the best way to produce this list? So from the above we would have:

        Monkey
        Dog
        Cat
        Gorilla

    In order of preferred solutions, as this will need to be done monthly:
    1. Dynamic formula based
    2. VB Script
    3. Other (advanced filtering or other manual steps)
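    A hedged VBA sketch of option 2, assuming the free-text cells live in a fixed range on a sheet named Data and the unique list should go to a sheet named Uniques (both names and the range are assumptions for illustration):

        Sub ListUniqueValues()
            Dim uniques As Object, cell As Range, key As Variant, r As Long
            Set uniques = CreateObject("Scripting.Dictionary")

            ' Collect every distinct non-empty value from the source range.
            For Each cell In Worksheets("Data").Range("A1:D100")
                If Len(Trim(cell.Value)) > 0 Then uniques(CStr(cell.Value)) = True
            Next cell

            ' Write the unique values to column A of the results sheet.
            r = 1
            For Each key In uniques.Keys
                Worksheets("Uniques").Cells(r, 1).Value = key
                r = r + 1
            Next key
        End Sub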

    Read the article

  • How can I view and sort by page count for multiple PDF files in a Windows file explorer?

    - by grunwald2.0
    I unsuccessfully used the "pages" feature in Windows Explorer, as well as in Directory Opus 10 and Free Commander XT (which I installed just for that reason, to try it out), to display the page count of multiple PDFs in a folder. All my PDFs are free to edit, i.e. not write-protected. I don't understand why any PDF reader can display the (correct) page number, but none of the file explorers can (in the "details" view, of course). The only documents whose page count is displayed are MS Word documents. Do I have to use Adobe Bridge? (I didn't try it.) On a side note: did that change in Windows 8? Initial research: a Google search was unsuccessful; the only slightly related SE topic I found was "How to count pages in multiple PDF files?".
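    If no file manager will show that column, a hedged workaround sketch in Python, assuming the pypdf package is installed (pip install pypdf); the folder path is an assumption for illustration:

        from pathlib import Path
        from pypdf import PdfReader

        folder = Path(r"C:\Users\me\Documents\pdfs")  # illustrative path

        # Collect (page count, file name) pairs and print them sorted by page count.
        counts = [(len(PdfReader(pdf).pages), pdf.name) for pdf in folder.glob("*.pdf")]
        for pages, name in sorted(counts, reverse=True):
            print(f"{pages:5d}  {name}")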

    Read the article

  • Solaris: What is the difference between .so and .so.1 files?

    - by Rob Goretsky
    I am trying to understand how/why certain library files are dynamically loaded by the linker on Solaris. I am using ldd to see this (with the -s switch to see what paths are tried by the linker). As an example, if I run "ldd /usr/local/bin/isql -s" I notice that one of the libraries that is searched for is called "libodbc.so.1". I notice that this does NOT match a file it finds along the way called "libodbc.so". So, it finally resolves to a place where there is a symbolic link between "libodbc.so.1.0.0" and "libodbc.so.1". My question is - what is the significance of the ".1" here? Is it to indicate a version number? Why do some installers create these symbolic links, while others don't?
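    For context, the trailing .1 is indeed a version number: it is the library's soname, which is embedded in the binary at link time and is what the runtime linker searches for, while the unversioned .so name exists only so that -lodbc resolves when you compile against the library. A hedged sketch of how installers typically produce that layout, using GNU toolchain syntax (the native Solaris linker spells the soname option -h instead; file names are illustrative):

        # Build the real file with an embedded soname of libodbc.so.1
        gcc -shared -fPIC -Wl,-soname,libodbc.so.1 -o libodbc.so.1.0.0 odbc.c

        # Runtime link: what ldd and the dynamic linker look for
        ln -s libodbc.so.1.0.0 libodbc.so.1

        # Development link: what "-lodbc" resolves to at build time
        ln -s libodbc.so.1 libodbc.so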

    Read the article

  • How can I prevent frame breaking in Chrome from Google image searches, etc.?

    - by Nick T
    More often than not, websites with any number of images use frame-breaking scripts to escape Google Image Search's results frame (e.g. this relatively benign case). While I somewhat understand the reasons for doing so (as ineloquently put forth by these people), more often than not such breakouts/redirects dump me on a useless page that doesn't have the image I was looking for, plus it makes going "back" rather irritating, as you need to click twice or more in rapid succession (some pages jam you through several redirects, it seems). Other than having reflexes quick enough to copy the 'Full-size image' hyperlink before the breakout script loads, is there a way to get my actual result?

    Read the article

  • Designing object oriented programming

    - by Pota Onasys
    Basically, I want to make API calls using an SDK I am writing. I have the following classes:
    - Car
    - CarData (stores the input values needed to create a car, like model, make, etc.)

    To create a car I do the following:

        [Car carWithData: cardata onSuccess: successHandler onError: errorHandler]

    That is basically a factory method that creates an instance of Car after making an API call, populating the new Car instance with the response, and passing that instance to the successHandler. So Car has the above static method to create a car, but also has non-static methods to edit and delete cars (which make edit and delete API calls to the server). The create static method passes a new car to the successHandler by doing the following:

        successHandler([[Car alloc] initWithDictionary: dictionary])

    The success handler can then go ahead and use that new car to do the following:

        [car update: cardata]
        [car delete]

    considering the new car object now has an ID that it can pass to the update and delete API calls. My questions:
    1. Do I need a CarData object to store user inputs, or can I store them in the Car object, which would also later store the response from all of the API calls?
    2. How can I improve this model?

    With regards to CarData, note that there might be different inputs for the different API calls. For example, the create function might need to know model, make, etc., but the find function might need to know the number of items to find, the limit, the start ID, etc.

    Read the article

  • Algorithms for Data Redundancy and Failover for distributed storage system?

    - by kennetham
    I'm building a distributed storage system that works with different storage sizes. For instance, my storage devices have sizes of 50GB, 70GB, 150GB, 250GB and 1000GB: 5 storage devices in one system. My application will store files of any type to the storage system. Question: how can I build a distributed store with data redundancy and fail-over for documents, videos and any other type of file, while ensuring that should any one storage device fail, there is another copy of its files on another device? The concern is that the 50GB device can only hold a fraction of the files that the 70GB or 150GB devices can. With 5 storage devices presented as one cloud store, is there any logical way to distribute the files through my application? How do I ensure data redundancy across different storage sizes? Is there any algorithm to collate multiple blob files into a single file archive? What is the best solution for one cloud storage pool built from multiple different storage sizes? I open this topic with the objective of discussing the best way to implement this idea, assuming simplicity: what are the issues of this implementation, performance measurements, and a discussion of the limitations?
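    A hedged starting-point sketch, not a full design: place each file on a fixed number of distinct devices chosen with probability proportional to their remaining capacity, so the small devices fill no faster than the large ones and every file survives a single-device failure. The device sizes come from the question; the replica count and everything else are assumptions for illustration.

        import random

        # Remaining capacity per device, in GB (sizes from the question).
        devices = {"d1": 50, "d2": 70, "d3": 150, "d4": 250, "d5": 1000}
        REPLICAS = 2  # every file is stored on two distinct devices

        def place(file_size_gb):
            """Pick REPLICAS distinct devices, weighted by free space."""
            candidates = {d: free for d, free in devices.items() if free >= file_size_gb}
            if len(candidates) < REPLICAS:
                raise RuntimeError("not enough devices with free space")
            chosen = []
            for _ in range(REPLICAS):
                names = list(candidates)
                weights = [candidates[n] for n in names]
                pick = random.choices(names, weights=weights, k=1)[0]
                chosen.append(pick)
                del candidates[pick]           # replicas must land on distinct devices
            for d in chosen:
                devices[d] -= file_size_gb     # account for the space consumed
            return chosen

        print(place(4))  # e.g. ['d5', 'd3']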

    Read the article

  • Working with lots of cubes. Improving performance?

    - by Randomman159
    Edit: To sum the question up, I have a voxel-based world (Minecraft style, thanks Communist Duck) which is suffering from poor performance. I am not positive about the source, but would like any possible advice on how to get rid of it. I am working on a project where a world consists of a large quantity of cubes (I would give you a number, but world sizes are user defined). My test world is around 48 x 32 x 48 blocks. Basically these blocks don't do anything in themselves; they just sit there. They start being used when it comes to player interaction: I need to check which cubes the user's mouse interacts with (mouse over, clicking, etc.), and do collision detection as the player moves. I had a massive amount of lag at first, looping through every block. I have managed to decrease that lag by looping through all the blocks, finding which blocks are within a particular range of the character, and then only looping through those blocks for the collision detection, etc. However, I am still running at a depressing 2fps. Does anyone have any other ideas on how I could decrease this lag? By the way, I am using XNA (C#) and yes, it is 3D.
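    A hedged C#/XNA-flavoured sketch of the usual fix, assuming the blocks sit on an integer grid: store them in a 3D array indexed by grid position, so the blocks near the player are found by computing indices directly rather than scanning every block in the world. The world dimensions come from the test world in the question; the names and the assumed usings (System, System.Collections.Generic, Microsoft.Xna.Framework) are illustrative.

        Block[,,] blocks = new Block[48, 32, 48];

        // Yields only the blocks within `range` of the player, touching nothing else.
        IEnumerable<Block> BlocksNear(Vector3 playerPos, int range)
        {
            int px = (int)playerPos.X, py = (int)playerPos.Y, pz = (int)playerPos.Z;
            for (int x = Math.Max(0, px - range); x <= Math.Min(47, px + range); x++)
                for (int y = Math.Max(0, py - range); y <= Math.Min(31, py + range); y++)
                    for (int z = Math.Max(0, pz - range); z <= Math.Min(47, pz + range); z++)
                        if (blocks[x, y, z] != null)
                            yield return blocks[x, y, z];
        }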

    Read the article

  • Prioritize file sharing performance in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008, used mainly for sharing files throughout the domain from a number of disks. It's running on VMware ESXi 4.0, in case that matters. My problem is that when I log in to the server to check user permissions etc., access to the files on the shared disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess it slows down to about 100kB/s as soon as I log in. This is on a gigabit network and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2GB of RAM to the server and reserved 1.5GHz of processor power for it. I don't have to do anything special on the server for this halt to occur. How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm running it will always keep working properly? Could this be a VMware issue?

    Read the article

  • Most efficient implementation of a tree in C++

    - by Topo
    I need to write a tree where each element may have any number of child elements, and because of this each branch of the tree may have any length. The tree is only going to receive elements at first, and is then going to be used exclusively for iterating through its branches in no specific order. The tree will have several million elements and must be fast but also memory efficient. My plan is a node class storing the elements and pointers to its children. When the tree is fully constructed, it would be transformed into an array or something faster and, if possible, loaded into the processor's cache. Construction and searching are two different problems: can I focus on how best to solve each problem individually? Construction has to be as fast as possible and can use memory as it pleases; then comes the transformation into a format that gives us speed when iterating the tree's branches. This should preferably be an array, to avoid going back and forth from RAM to cache for each element of the tree. So the real question is: which structure should I use to implement the tree so as to maximize insert speed, and how can I then transform it into a structure that gives me the best speed and memory?
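    A hedged sketch of one common approach, under the assumptions above (build first, then iterate only): construct the tree with per-node child vectors, which makes inserts cheap, and afterwards flatten it into one contiguous vector in breadth-first order, where each node records the index of its first child and its child count, so iteration walks adjacent memory. All names are illustrative, and this is a sketch rather than a definitive answer.

        #include <cstdint>
        #include <queue>
        #include <vector>

        // Build phase: cheap to append children in any order.
        struct BuildNode {
            int value;
            std::vector<BuildNode*> children;
        };

        // Iteration phase: all nodes packed into one vector, children contiguous.
        struct FlatNode {
            int value;
            std::uint32_t first_child = 0;  // index of the first child in the flat vector
            std::uint32_t child_count = 0;
        };

        // Breadth-first flattening: the children of every node end up adjacent.
        std::vector<FlatNode> flatten(const BuildNode* root) {
            std::vector<const BuildNode*> order;
            std::queue<const BuildNode*> q;
            q.push(root);
            while (!q.empty()) {
                const BuildNode* n = q.front(); q.pop();
                order.push_back(n);
                for (const BuildNode* c : n->children) q.push(c);
            }
            std::vector<FlatNode> flat(order.size());
            std::uint32_t next_child = 1;  // index 0 is the root itself
            for (std::size_t i = 0; i < order.size(); ++i) {
                flat[i].value = order[i]->value;
                flat[i].first_child = next_child;
                flat[i].child_count = static_cast<std::uint32_t>(order[i]->children.size());
                next_child += flat[i].child_count;
            }
            return flat;
        }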

    Read the article

  • Outlook keeps forgetting safe sender domains and ignoring contacts

    - by Jivlain
    I get a large number of quite similar emails from a particular address, most of which Outlook 2010 identifies as spam. None of these are actually spam. I have Outlook's junk email protection set to Low, and have told it to trust email from my contacts. I have added the address to my list of safe senders, and I have also tried adding it to my contacts. However, it keeps dropping the address from my safe senders list: I add it, it stays there for a while, but eventually a legitimate mail is identified as spam and the address has been dropped from the list. Meanwhile, despite adding that sender to my contacts, Outlook is still classifying their mail as junk. Any ideas how to fix this?

    Read the article

  • HP Proliant DL160 G6 - Hardware RAID card to get? [closed]

    - by zhuanyi
    I have bought a DL160 G6 server (product number: 490427R-001) and it does not come with a hardware RAID card. I am trying to set up a VMware ESXi server, and as such I need a hardware RAID card. I am just wondering whether there is any card that would fit into the chassis. Would a P200i fit? How about a P400? Also, is there any non-HP RAID card that would do the trick? I have 4 SATA 160GB hard drives already plugged in. Thanks a lot!

    Read the article
