Search Results

Search found 29636 results on 1186 pages for 'fine uploader'.


  • Dealing with Fine-Grained Cache Entries in Coherence

    - by jpurdy
    On occasion we have seen significant memory overhead when using very small cache entries. Consider the case where there is a small key (say a synthetic key stored in a long) and a small value (perhaps a number or short string). With most backing maps, each cache entry will require an instance of Map.Entry, and in the case of a LocalCache backing map (used for expiry and eviction), additional metadata is stored as well (such as last access time). Given the size of this data (usually a few dozen bytes) and the granularity of Java memory allocation (often a minimum of 32 bytes per object, depending on the specific JVM implementation), it is easy to end up with a cache entry that appears to be a couple dozen bytes but occupies several hundred bytes of actual heap, resulting in anywhere from a 5x to 10x increase in stated memory requirements. In most cases, this increase applies to only a few small NamedCaches and is inconsequential -- but in some cases it might apply to one or more very large NamedCaches, in which case it may dominate memory sizing calculations.

    Ultimately, the requirement is to avoid the per-entry overhead, which can be done either at the application level, by grouping multiple logical entries into single cache entries, or at the backing map level, again by combining multiple entries into a smaller number of larger heap objects. At the application level, it may be possible to combine objects based on parent-child or sibling relationships (basically the same requirements that would apply to using partition affinity). If there is no natural relationship, it may still be possible to combine objects, effectively using a Coherence NamedCache as a "map of maps". This forces the application to first find a collection of objects (by performing a partial hash) and then to look within that collection for the desired object. This is most naturally implemented as a collection of entry processors, to avoid pulling unnecessary data back to the client (and also to encapsulate that logic within a service layer).

    At the backing map level, the NIO storage option keeps keys on heap, and so has limited benefit for this situation. The Elastic Data features of Coherence naturally combine entries into larger heap objects, with the caveat that only data -- and not indexes -- can be stored in Elastic Data.
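
    A minimal sketch of the "map of maps" idea, assuming the standard Coherence API (CacheFactory, NamedCache, AbstractProcessor); the group size, cache name, and class names are illustrative, not from the article:

        import com.tangosol.net.CacheFactory;
        import com.tangosol.net.NamedCache;
        import com.tangosol.util.InvocableMap;
        import com.tangosol.util.processor.AbstractProcessor;

        import java.util.HashMap;
        import java.util.Map;

        public class GroupedCache {
            static final int GROUP_SIZE = 1024; // logical entries per physical entry

            // Partial hash: all logical keys in a group share one cache entry.
            static Long groupKey(long logicalKey) {
                return Long.valueOf(logicalKey / GROUP_SIZE);
            }

            public static void put(final long logicalKey, final String value) {
                NamedCache cache = CacheFactory.getCache("grouped");
                cache.invoke(groupKey(logicalKey), new AbstractProcessor() {
                    public Object process(InvocableMap.Entry entry) {
                        // The physical value is itself a map: logical key -> value.
                        Map group = entry.isPresent()
                                ? (Map) entry.getValue() : new HashMap();
                        group.put(Long.valueOf(logicalKey), value);
                        entry.setValue(group); // write the enlarged group back
                        return null;
                    }
                });
            }
        }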

    Read the article

  • Site inaccessible by some people, fine for others [on hold]

    - by Paul Howell
    A couple of days ago my website www.howellphoto.com (a WordPress site hosted by one.com) started loading really slowly, and I have been unable to access any pages linked from the homepage. Several of my friends have found the same issue, yet many are able to access the site without problem. Live support at one.com has not been much help, requesting the IP addresses of a few people who cannot access the site and saying it could be a firewall issue. WordPress support (my site was created in ProPhoto Blogs) has been better and has updated all plugins, etc., but can see no issue from their end. My main issue is that even if there were a local fix I could do on my computer, this would not help with any potential customers visiting my site for information! This is driving me crazy!!! Any help will be legendary! Cheers, Paul

    Read the article

  • Internet Not Working... Initially Worked fine

    - by Sneha429
    I am using an HP Pavilion desktop; I'm not sure of the exact model. Initially, Ubuntu 13.10 (Saucy) looked and worked great. It's my parents' machine, and after two days of no issues, they started having more than a few. They mostly use the internet to access email and videos; I believe Google Chrome was the only application they used. Then the internet stopped working. Restarting didn't fix it, and neither did turning the internet connection off and on. The computer went from lightning fast to inexplicably slow. I've read some other posts about the internet not working, but none seemed applicable. Any suggestions?

    Read the article

  • Every Flash uploader giving bad progress values

    - by Mike Boers
    The file upload script I wrote early last year for an internal website has been misbehaving oddly on a number of machines. On some machines it consistently works fine; on others it consistently misbehaves. I am having exactly the same problem with YUI Uploader, SWFUpload (2.2 and 2.5a), and Uploadify. On the misbehaving machines, the progress event (or callback, as the case may be) is reporting the upload going far too quickly. It is progressing at around 9 or 10 MB/s, instead of the 50 or 60 KB/s the transfer is actually achieving. The progress bar fills up very quickly, and then no more progress events are triggered. A few minutes later the completion event fires when the upload is actually done. I must emphasize that the file upload does proceed normally, even though the progress being reported is very wrong. The progress events report a correct file size, but the reported amount uploaded is usually far too high, and it appears to always be a multiple of 2^16 (65536). I'm only having this problem with Firefox 3.5 on Windows XP, all of which have various sub-versions of Flash 10. Has anyone heard of this happening, or have any idea what is going on? (I'm off to go file a number of bug reports, but hopefully someone here has some previous experience with this.)
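
    For anyone needing a stopgap while the underlying player bug is investigated, here is a hedged client-side sketch (SWFUpload-style handler signature; the rate ceiling and the updateProgressBar helper are illustrative) that simply discards progress events implying an impossible upload rate:

        // Discard Flash progress reports that imply an implausible upload rate.
        var lastBytes = 0;
        var lastTime = 0;
        var MAX_BYTES_PER_SEC = 256 * 1024; // assumed ceiling; tune to the deployment

        function uploadProgress(file, bytesLoaded, bytesTotal) {
            var now = new Date().getTime();
            if (lastTime > 0) {
                var rate = (bytesLoaded - lastBytes) / ((now - lastTime) / 1000);
                if (rate > MAX_BYTES_PER_SEC) {
                    return; // bogus event: leave the progress bar where it was
                }
            }
            lastBytes = bytesLoaded;
            lastTime = now;
            updateProgressBar(file, bytesLoaded / bytesTotal); // hypothetical UI helper
        }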

    Read the article

  • ustream and justin.tv don't work, YouTube and bitgravity are fine

    - by scottstonehouse
    Can anyone explain this? On this laptop, ustream and justin.tv don't work - I just get a blank screen. But YouTube and bitgravity work fine (also Flash-based). This is 64-bit Windows 7 - but I have 64-bit Windows 7 on my desktop machine and everything works fine there. A good place to test is http://live.twit.tv/ where the bitgravity feeds work fine, but ustream and justin.tv don't.

    Read the article

  • Scheduled task does not run unattended on Windows 2003 server on VMware, runs fine otherwise

    - by lnm
    Scheduled task does not run on a Windows 2003 server on VMware; the same setup runs fine on a standalone server. The test below explains the problem. We really need to run a more complex bat file, but this shows the issue. I have a bat file that copies a file from server A to server B. I use the full path name, no drive mapping. It runs fine on server B from the command prompt. I created a task that runs this bat file under a domain id, with password, that is part of the administrators group on both servers. The task runs fine from the Scheduled Tasks screen, and as a scheduled task as long as somebody is logged into the server. If nobody is logged in, the task does not run. There is no error message in the Task Scheduler log, just an entry that the task started, but no entry for a finish or an error code. To add insult to injury, if the task copies a file in the opposite direction, from server B to server A, it runs fine as a scheduled unattended task. If I copy a file from server B to server B, the task also runs fine unattended. I recreated exactly the same setup on a standalone server: no issues at all. I checked the obvious things: the task has "Run only if logged on" unchecked, the domain id has the "log on as a batch job" privilege and logon rights, and the Task Scheduler service runs as Local System with automatic start. Any suggestions?

    Read the article

  • Cannot upload media via WordPress uploader

    - by Justin Johnson
    This has to do with media uploading in WordPress. Every time WP creates a folder for new uploads (it organizes uploads by year and month: yyyy/mm), it creates it with the "apache:apache" user and group, with full access for all (777 or drwxrwxrwx). However, after that, WP cannot create a folder within that folder (e.g. mkdir 2011 succeeds, but mkdir 2011/01 fails). Also, uploads cannot be moved into these newly created folders even though the permissions are 777 (rwxrwxrwx). Once a month, I have to chown the newly created folders to the same user:group as the rest of the files. Once I do that, uploading works fine (which doesn't make sense to me). The really frustrating part is that this problem doesn't exist in other WP installs on other domains on the same server. (I wasn't sure if this should be here or on Server Fault.)

    Edit: The containing directory /.../httpdocs/blog/wp-content/uploads has the correct ownership:

        drwxrwxrwx 5 myuser psaserv 4096 Jun 3 18:38 uploads

    This is a Plesk/CentOS environment hosted by Media Temple (dv). I've written the following test script to simulate the problem:

        <?php
        $d = "d" . mt_rand(100, 500);
        var_dump(
            get_current_user(),
            $d,
            mkdir($d),
            chmod($d, 0777),
            mkdir("$d/$d"),
            chmod("$d/$d", 0777),
            fileowner($d),
            getmyuid()
        );

    The script always creates the first directory (mkdir($d)) successfully. On domain A, where the WP problem is, it cannot create the nested directory (mkdir("$d/$d")). However, on domain B, both directories are successfully created. I am running the scripts at /var/www/vhosts/domainA/httpdocs/tmp/t.php and /var/www/vhosts/domainB/httpdocs/tmp/t.php respectively. I checked the permissions on tmp, httpdocs, and domain[AB] and they are the same for each path. The only thing that differs is the user.
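
    A hedged diagnostic sketch (assumes the POSIX extension is available; the path is the one from the question) that compares the user PHP runs as with the owner of the uploads directory -- a mismatch on domain A but not on domain B would point to one domain running PHP as an Apache module and the other as CGI/FastCGI under the site owner:

        <?php
        $uploads = '/var/www/vhosts/domainA/httpdocs/blog/wp-content/uploads';

        // Who is PHP running as, and who owns the directory WP writes into?
        $processUser = posix_getpwuid(posix_geteuid());
        $dirOwner    = posix_getpwuid(fileowner($uploads));

        printf("PHP runs as:      %s\n", $processUser['name']);
        printf("uploads owner is: %s\n", $dirOwner['name']);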

    Read the article

  • FineUploader multiple instances and dynamic naming

    - by RichieMN
    I am using FineUploader 4.0.8 within an MVC4 project, using the jQuery wrapper. I have an example of my js code that creates a single instance of fineUploader and works just fine. Now I need more than one instance of fineUploader, but each individual control doesn't know anything about the others, and they're rendered as needed on a page (I've seen previous questions using a jQuery .each, which won't work here). Currently, I can't seem to correctly render any upload buttons, probably due to having an ID duplicated or something. See below for how I'm using MVC's Razor to create unique variables and html IDs for each individual instance. Here's my current implementation, where I've added the dynamic values (places where you see _@Model.{PropertyName}):

        // Uploader control setup
        var uploader_@Model.Id = $('#fineUploader_@Model.Id').fineUploader({
            debug: true,
            template: 'qq-template_@Model.Id',
            button: $("#uploadButton_@Model.Id"),
            request: {
                endpoint: '@Url.Action("UploadFile", "Survey")',
                customHeaders: { Accept: 'application/json' },
                params: {
                    instance_@Model.Id: (function () { return instance; }),
                    surveyItemResultId_@Model.Id: (function () { return surveyItemResultId; }),
                    itemId_@Model.Id: (function () { return itemId; }),
                    loopingCounter_@Model.Id: (function () { return loopingCounter++; })
                }
            },
            validation: {
                acceptFiles: ['image/*', 'application/xls', 'application/pdf', 'text/csv',
                    'application/vnd.openxmlformats-officedocument.spreadsheetml.template',
                    'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
                    'application/vnd.ms-excel'],
                allowedExtensions: ['jpeg', 'jpg', 'gif', 'png', 'bmp', 'csv', 'xls', 'xlsx', 'pdf', 'xlt', 'xltx', 'txt'],
                sizeLimit: 1024 * 1024 * 2.5, // 2.5MB
                stopOnFirstInvalidFile: false
            },
            failedUploadTextDisplay: { mode: 'custom' },
            multiple: true,
            text: { uploadButton: 'Select your upload file(s)' }
        }).on('submitted', function (event, id, filename) {
            $("#modal-overlay").fadeIn();
            $("#modal-box").fadeIn();
            submittedCount_@Model.Id++;
            $(':input[type=button], :input[type=submit], :input[type=reset]').attr('disabled', 'disabled');
        }).on('complete', function (event, id, filename, responseJSON) {
            completedCount_@Model.Id++;
            if (completedCount_@Model.Id == submittedCount_@Model.Id) {
                $(':input[type=button], :input[type=submit], :input[type=reset]').removeAttr('disabled');
                //$("#overlay").fadeOut();
                $("#modal-box").fadeOut();
                $("#modal-overlay").fadeOut();
            }
        }).on('error', function (event, id, name, errorReason, xhr) {
            //$("#overlay").fadeOut();
            alert('error: ' + errorReason);
            $("#modal-box").fadeOut();
            $("#modal-overlay").fadeOut();
        });

    Using the same principle as above, I've added this logic to the template too. Here's part of my template:

        <script type="text/template" id="qq-template_@Model.Id">
            <div class="qq-uploader-selector qq-uploader">
                <div class="qq-upload-drop-area-selector qq-upload-drop-area qq-hide-dropzone">
                    <span>Drop files here to upload</span>

    What I see when I use the above code is only the drag-and-drop section visible, and no button. There are no js errors either. I do have an example that has only one instance of this control on the page, and the results are the same (visible drag-and-drop section, no button). Any thought as to what's going on? I'll gladly update the question if you find I'm missing something helpful. Please don't just -1 me if it's something I can easily improve or fix.
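
    Since the controls render independently, one hedged approach (names are illustrative, not from the question) is to keep a single shared init function on the page and have each Razor-rendered control call it once with its own unique ids, so no fine-uploader wiring is duplicated:

        function initSurveyUploader(opts) {
            return $('#' + opts.uploaderId).fineUploader({
                template: opts.templateId,          // unique template id per instance
                button: $('#' + opts.buttonId),     // unique button id per instance
                request: {
                    endpoint: opts.endpoint,
                    params: { itemId: opts.itemId }
                }
            });
        }

        // Each control's Razor output then reduces to one call, e.g.:
        // initSurveyUploader({ uploaderId: 'fineUploader_12', templateId: 'qq-template_12',
        //                      buttonId: 'uploadButton_12',
        //                      endpoint: '/Survey/UploadFile', itemId: 12 });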

    Read the article

  • PHP not working when accessed through a domain name, but works fine when accessed through IP

    - by Allister
    I've done a basic setup of Ubuntu Server, installing Apache, PHP and MySQL through tasksel. When I browse to the IP address of the server it works fine and renders PHP scripts. So I added a DNS entry for the server to my local DNS server, calling it webdev.lazer.net. When I go to this domain name through my browser it renders HTML documents fine, but if I try to view PHP scripts they aren't rendered and download as plain text to the browser (as if the PHP parser isn't handling .php documents). I'm sure it's some rookie mistake, but any help would be appreciated. Thanks
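
    One hedged possibility: browsing by IP and by name may be hitting different virtual hosts, and a name-based vhost that lacks the PHP handler will serve .php files as plain text. A sketch of the kind of stanza to check for (paths illustrative, mod_php assumed):

        <VirtualHost *:80>
            ServerName webdev.lazer.net
            DocumentRoot /var/www
            # Without a handler mapping like this, .php is served as plain text:
            AddType application/x-httpd-php .php
        </VirtualHost>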

    Read the article

  • phpMyAdmin running very slow over the internet but fine locally

    - by columbo
    I connect to phpMyAdmin remotely on a CentOS server from my local PC via Firefox. Usually it's fine, but today it's really slow (2 minutes to load a page), sometimes timing out. Other connections to the server are fine: the SSH command line is as fast as ever, as is the GNOME desktop over SSH. In fact, on the GNOME desktop I can run phpMyAdmin locally from its browser and it's as quick as ever (which is a workaround, of course). I've checked the various log files and seen nothing unusual, and I've logged into the MySQL command line and the database is running fine without any slowing whatsoever. So it just seems to be slow when I access phpMyAdmin on the server from the browser on my remote PC (I've tried IE and Firefox; both are slow). Has anyone experienced this or have any ideas what the issue could be? Connecting via CLI through a tunnel works OK - the problem is in phpMyAdmin for sure. Cheers

    Read the article

  • CakePHP file upload problem

    - by vatismarty
    I am using CakePHP as my framework. I have a problem uploading files through a form. I am using an Uploader plugin from THIS website. My php.ini file has:

        upload_max_filesize = 10M
        post_max_size = 8M

    This is from uploader.php (the plugin file):

        var $maxFileSize = '5M'; // default max file size

    In my controller file, I use this code to set the max file size to 1 MB at runtime:

        function beforeFilter() {
            parent::beforeFilter();
            $this->Uploader->maxFileSize = '1M';
        }

    In uploader.php, the plugin does this:

        if (empty($this->maxFileSize)) {
            $this->maxFileSize = ini_get('upload_max_filesize'); // landmark 1
        }

        $byte = preg_replace('/[^0-9]/i', '', $this->maxFileSize);
        $last = $this->bytes($this->maxFileSize, 'byte');

        if ($last == 'T' || $last == 'TB') {
            $multiplier = 1;
            $execTime = 20;
        } else if ($last == 'G' || $last == 'GB') {
            $multiplier = 3;
            $execTime = 10;
        } else if ($last == 'M' || $last == 'MB') {
            $multiplier = 5;
            $execTime = 5;
        } else {
            $multiplier = 10;
            $execTime = 3;
        }

        // 'memore_limit' appears to be a typo for 'memory_limit' in the plugin
        ini_set('memore_limit', (($byte * $multiplier) * $multiplier) . $last);
        ini_set('post_max_size', ($byte * $multiplier) . $last); // error suspected here
        ini_set('upload_tmp_dir', $this->tempDir);
        ini_set('upload_max_filesize', $this->maxFileSize); // landmark 2

    EXPECTED RESULT: When I try uploading a file that is 2 MB in size, the upload should fail, because maxFileSize is 1 MB at runtime.

    THE PROBLEM IS: the file gets uploaded anyway. Landmark 1 does not get executed, and landmark 2 does not seem to work: upload_max_filesize does not take the value from maxFileSize. Please help me... thank you
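
    A quick check, independent of the plugin, of why landmark 2 can never work: upload_max_filesize and post_max_size are PHP_INI_PERDIR directives, so ini_set() on them is rejected at runtime:

        <?php
        // upload_max_filesize and post_max_size can only be set in php.ini,
        // .htaccess, or the server config (they are PHP_INI_PERDIR), so
        // ini_set() rejects them at runtime and returns false.
        var_dump(ini_set('upload_max_filesize', '1M')); // bool(false)
        var_dump(ini_get('upload_max_filesize'));       // unchanged php.ini value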

    Read the article

  • 3 Servers, 2 Work Fine, One Has Network Issues

    - by ScaleOvenStove
    I have 3 servers, all with relatively the same hardware/config, etc. I run some data pulls on all 3; two of them have 1 NIC each, and they work fine. On the other, there are 2 NICs, and unless they are both plugged in or teamed, the processes time out. Any ideas why this would be? It doesn't make sense to me, as the other two work fine with 1 NIC and don't time out when running the same processes.

    Read the article

  • Rails 3 Flash Uploader

    - by klynch
    I am trying to get Uploadify working with Rails 3. However, I can't insert the middleware with the correct arguments. This is the Rails 2 way:

        ActionController::Dispatcher.middleware.insert_before(
          ActionController::Session::CookieStore,
          FlashSessionCookieMiddleware,
          ActionController::Base.session_options[:key]
        )

    This is what I have so far for Rails 3:

        Rails.application.config.middleware.insert_before(
          Rails.application.config.session_store,
          FlashSessionCookieMiddleware,
          Rails.application.config.session_options[:key]
        )

    However, this gives:

        kevin$hephaestus:$exposure [1035 | 0]% rake middleware
        (in /Users/kevin/Projects/exposure)
        rake aborted!
        protected method `session_options' called for #<Rails::Application::Configuration:0x101eb28d0>
        (See full trace by running task with --trace)
        zsh: exit 1 rake middleware

    When I comment out the session_options argument, the middleware is successfully inserted, but it can't do what it is supposed to. Any suggestions?
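
    A hedged sketch of the form commonly posted for Rails 3 (the session store class moved from ActionController to ActionDispatch; whether config.session_options is public changed between early Rails 3 releases, so treat this as a starting point rather than a guaranteed fix):

        Rails.application.config.middleware.insert_before(
          ActionDispatch::Session::CookieStore,   # Rails 3 class name
          FlashSessionCookieMiddleware,
          Rails.application.config.session_options[:key]
        )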

    Read the article

  • Free image uploader with CKEditor

    - by Zakaria
    Hi everybody, I'm using CKEditor to let users write nice rich text. The problem is that, with the newest version, we must couple it with CKFinder (which requires a license key if deployed) to let users "explore the server" or upload images. Is there another, free plugin besides CKFinder? Should I go back to FCKeditor rather than CKEditor? Thank you, Regards.

    Read the article

  • Issues with RegularExpressionValidator in VB .NET 2005 using ASP File Uploader

    - by JFV
    I'm looking to validate that the file name contains the single word "detail" (upper, lower, or mixed case) prior to submitting my VB .NET 2005 page. I used Regex Builder and the expression below validates there, but it's not working in my web page... Does anyone have any ideas? Input file location:

        <input id="btnBrowseForFile" runat="server" enableviewstate="true"
               name="btnBrowseForFile" style="width: 500px" type="file" />
        <asp:RequiredFieldValidator ID="RequiredFieldValidator2" runat="server"
            ControlToValidate="btnBrowseForFile"
            ErrorMessage="*Please select an input file." Display="Dynamic"></asp:RequiredFieldValidator>
        <asp:RegularExpressionValidator ID="RegularExpressionValidator1" runat="server"
            ControlToValidate="btnBrowseForFile" Display="Dynamic"
            ErrorMessage='*Please select a file that contains the word "detail"'
            ValidationExpression="(\b|\s|\w)(d|D)(e|E)(t|T)(a|A)(i|I)(l|L)(\s|\b|\w)"></asp:RegularExpressionValidator>

    Thanks!!! JFV
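
    One hedged observation for readers: RegularExpressionValidator only passes when the expression matches the entire control value, so a pattern built to find "detail" somewhere inside a full file path needs explicit wildcards around it, along the lines of:

        ValidationExpression=".*[dD][eE][tT][aA][iI][lL].*"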

    Read the article

  • Explorer right-click uploader for CMS?

    - by Pekka
    I'm looking for a "right-click upload" application like RightLoad - an application that can upload media files to a remote FTP server from the Windows Explorer context menu. I want to customize the application to serve as a customized image-uploading tool for a PHP-based CMS. The user would upload images and other media files to a defined FTP account (I'm also very open to other methods of transport, as long as they are supported by run-of-the-mill web hosting stacks) that they could then use in the CMS they log in to. For me to be able to make these customizations, the application would have to be open source - RightLoad is "only" freeware. Alternatively, I'm open to closed-source and commercial suggestions as long as they allow "pre-packaged" server settings that can easily be deployed to the user. Does anybody know such a tool compatible with at least the most current versions of Windows (XP, Vista, 7)?

    Read the article

  • Ruby on Rails: restrict file type with Paperclip using a flash uploader

    - by aperture
    I have a pretty basic Paperclip Upload model that is attached to a User model through has_many, and am using Uploadify to do the actual uploading. Flash sends all files with the content type "application/octet-stream", so using validates_attachment_content_type rejects all files. In my create action, I am able to get the mime-type from the original file name, but only after it's been saved, with:

        def coerce(params)
          h = Hash.new
          h[:upload] = Hash.new
          h[:upload][:attachment].content_type =
            MIME::Types.type_for(h[:upload][:attachment].original_filename).to_s
          ...
        end

    and

        def create
          diff_params = coerce(params)
          @upload = Upload.new(diff_params[:upload])
          ...
        end

    What would be the best way of whitelisting file types? I am thinking a before_validation method, but I'm not sure how that would work. Any ideas would be welcome.
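
    A hedged before_validation sketch (assumes a Paperclip 2.x-style model with attachment_file_name/attachment_content_type columns; the allowed list is illustrative) that coerces the Flash-supplied type before validation runs:

        class Upload < ActiveRecord::Base
          has_attached_file :attachment
          validates_attachment_content_type :attachment,
            :content_type => ['image/jpeg', 'image/png', 'image/gif']

          before_validation :fix_flash_content_type

          private

          # Flash uploads arrive as application/octet-stream; recover the real
          # type from the file name so the whitelist above can do its job.
          def fix_flash_content_type
            if attachment_content_type == 'application/octet-stream'
              mime = MIME::Types.type_for(attachment_file_name).first
              self.attachment_content_type = mime.to_s if mime
            end
          end
        end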

    Read the article

  • best method for background uploader in Android

    - by Dr.Dredel
    Problem: I want to write a process that will allow a user to take photos with the device and have those photos uploaded to some listener in the cloud. The user should not have to do anything to initiate the upload; a background listener would just watch the folder, and as long as it finds files there it would upload and then delete them. Two problems: 1) how to keep the program running in the background even after the user is no longer taking pictures (and, if they reboot the device, have it wake up and finish the uploads, if any remain); 2) assuming the connection is spotty (as it always is), how to verify that a given image has completed its upload, and if not, to resubmit it. I don't need any code examples, I just would like opinions on the best strategy to get this implemented. I was going to use Apache Commons and just do an upload to a PHP script, but am not sure what sort of error checking exists to take a mid-file connection drop into account. TIA.

    Read the article

  • Static file download from browser breaking in varnish but works fine in Apache

    - by Ron
    I would first like to thank everyone at Server Fault for this great website; I also land on this site while searching in Google for various server-related issues and setups. I have an issue today, so I am posting here in the hope that the seniors can help me out.

    I set up a website on a dedicated server a few days ago, and I used Varnish 3 as the frontend to Apache 2 on a Debian Lenny server, as the traffic was a bit high. There are several static file downloads of around 10-20 MB in size on the website. The website looked fine in the first few days after setup. I was checking from a 5 mbps+ broadband connection, and the file downloads completed in seconds and worked fine. But today I realized that on a slow internet connection the file downloads were breaking off. When I tried to download the files from the website using a browser, the download broke off after a minute or so. It kept happening again and again, so it had nothing to do with the internet connection. The connection was around 512 kbps, so it was not dial-up-level speed either, but a decent speed at which files should download easily, though not that fast.

    Then I thought of trying the Apache backend port directly to check whether the problem still occurs. On adding the Apache port to the static file download URL, the files downloaded easily and did not break even once. I tried it several times to make sure it was not a coincidence, but every time I used the Apache port in the file download URL it downloaded fine, while it broke each time with the normal link, which is routed through Varnish.

    So it seems Varnish has somehow caused the broken file downloads. Could anyone give any idea as to why it is happening and how to fix the problem? For clarification, take this example: Apache backend on port 8008, Varnish frontend on port 80. When I download, say:

        http://mywebsite.com/directory/filename.extension

    the download breaks off after a minute or so. I cannot be sure it is due to the time or the size, I am just assuming; it may be some other reason too. But when I download using:

        http://mywebsite.com:8008/directory/filename.extension

    the file download does not break at all. So it seems that Varnish is somehow breaking the file download, not Apache. Does anybody have any idea why this is happening and how it can be fixed? Any help would be highly appreciated. My Varnish default.vcl is:

        backend apache {
            .host = "127.0.0.1";
            .port = "8008";
        }

        sub vcl_deliver {
            remove resp.http.X-Varnish;
            remove resp.http.Via;
            remove resp.http.Age;
            remove resp.http.Server;
            remove resp.http.X-Powered-By;
        }
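
    A hedged hunch from the symptoms (slow clients cut off mid-response, fast clients fine): varnishd's send_timeout parameter, which disconnects clients that take too long to receive a response. Two things worth testing, with illustrative values and paths:

        # 1) Raise the delivery timeout (varnishd startup parameter):
        #        varnishd ... -p send_timeout=7200
        #    or at runtime via the management interface:
        #        varnishadm param.set send_timeout 7200

        # 2) Or bypass caching for the large static downloads entirely
        #    (Varnish 3 VCL; the /downloads/ prefix is illustrative):
        sub vcl_recv {
            if (req.url ~ "^/downloads/") {
                return (pipe);
            }
        }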

    Read the article
