Search Results

Search found 29949 results on 1198 pages for 'large scale website'.

  • How to scale a PHP application (servers, mysql, memcache)

    - by Stéphane Goetz
    Hi, I'm currently building a website for a social project in Switzerland, and I want to prepare the application to scale before there is an overflow of users. I've answered many of my own questions already, but some remain. Here is the plan. First (short term): a single server running DNS, PHP, MySQL, memcache, and the data. Second: split into two servers, one with DNS, MySQL, and memcache, the other with the data and PHP. Third is where I'm stuck on how to keep the application running well. I could do: Front: load balancer, memcache, DNS. Web 1: PHP, data. Web 2: PHP, data. MySQL. Under this scheme all PHP sessions are kept in the DB. BUT how do I sync the data between the web servers? Do I run rsync to keep them up to date? Do I put the data on a separate network disk to be safe? In that case, how do I handle user uploads? And if the website gets more successful and we have to move to a bigger structure, wouldn't that create latency on updates? Or would it be better to go directly to Amazon's web services? Some info: I use CodeIgniter as the framework, and Linux as the web server (distribution not chosen yet, but it should be Debian). Thanks in advance for your answers.
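
    To make the multi-server step workable, sessions and shared state have to live somewhere central. A minimal sketch of DB-backed sessions in CodeIgniter; the option names below are from CodeIgniter 2.x's config/config.php and are an assumption to verify against your version:

        <?php
        // config/config.php -- keep sessions in MySQL rather than on a
        // node-local disk, so any web server behind the load balancer
        // can serve any request.
        $config['sess_use_database'] = TRUE;
        $config['sess_table_name']   = 'ci_sessions';

    User uploads follow the same logic: write them to shared storage (a network disk or a service like Amazon S3) instead of a web node's local disk, and the rsync question largely disappears.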

    Read the article

  • Optimal way to make MySQL backups for fairly large databases (MyISAM / InnoDB)

    - by WinkyWolly
    Currently we have one beefy MySQL database that runs a couple of high-traffic Django-based websites as well as some e-commerce websites of decent size. As a result we have a fair number of large databases using both InnoDB and MyISAM tables. Unfortunately we've recently hit a wall due to the amount of traffic, so I've set up another master server to help alleviate reads and backups. At the moment I simply use mysqldump with a few arguments, and it's proven to be fine... until now. mysqldump is a quick-and-dirty method, and I believe we've outgrown it. I now need a good alternative and have been looking into Maatkit's mk-parallel-dump utility or an LVM snapshot solution. Short version: I have fairly large MySQL databases I need to back up; the current mysqldump method is inefficient and slow (causing issues); and I'm looking into something such as mk-parallel-dump or LVM snapshots. Any recommendations or ideas would be appreciated; since I have to redo how we're doing things, I'd rather have it done properly and as efficiently as possible :).

    Read the article

  • Decimal data type displays scale part as zeros

    - by Wael Dalloul
    I have a decimal field in a SQL Server 2005 table: Price decimal(18, 4). If I write 12 it is stored as 12.0000; if I write 12.33 it is stored as 12.3300. It always pads the right of the decimal point with zeros up to the scale (4). I was using these columns in SQL Server 2000 and it did not behave like this: in SQL Server 2000, if I put in 12.5 it was stored as 12.5, not as the 12.5000 that SQL Server 2005 produces. My question is: how do I stop SQL Server 2005 from putting zeros to the right of the decimal point?

    Read the article

  • PHP Scale image and then create image of specific size

    - by Ronny vdb
    The result I am trying to achieve is that the user can upload an image of any size, and the server-side script will convert it to an image with the dimensions 1024x512. To preserve the aspect ratio, I first scale the image down by either the width or the height to match the wanted dimensions. To do this I am using imagecopyresampled(): imagecopyresampled( $new_img, $src_img, $dst_x, $dst_y, 0, 0, $new_width, $new_height, $img_width, $img_height ) && $write_image($new_img, $new_file_path, $image_quality); If we take an image that is 1920x1200 as an example, the output image will be 819x512. However, I still need a final result of 1024x512, so I want to add white borders to the image to fill the width out to 1024. How can I correct my imagecopyresampled call to achieve this?
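
    For reference, one way to get the fixed 1024x512 output is to scale to fit and then center the result on a white canvas. A sketch using the same GD calls as above (variable names follow the question where possible; the rest are assumptions):

        <?php
        // Final canvas size the question asks for.
        $target_w = 1024;
        $target_h = 512;

        // Scale to fit inside the target box while keeping the aspect ratio.
        $ratio      = min($target_w / $img_width, $target_h / $img_height);
        $new_width  = (int) round($img_width * $ratio);
        $new_height = (int) round($img_height * $ratio);

        // White canvas at the final size.
        $new_img = imagecreatetruecolor($target_w, $target_h);
        $white   = imagecolorallocate($new_img, 255, 255, 255);
        imagefill($new_img, 0, 0, $white);

        // Center the scaled image, leaving white borders on the short axis.
        $dst_x = (int) (($target_w - $new_width) / 2);
        $dst_y = (int) (($target_h - $new_height) / 2);
        imagecopyresampled($new_img, $src_img, $dst_x, $dst_y, 0, 0,
                           $new_width, $new_height, $img_width, $img_height);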

    Read the article

  • IP addresses not listed for IIS website bindings

    - by Svinn
    I recently purchased a Windows cloud server from GoDaddy and installed IIS 7 and all the other required software. The server has the public IP 50.62.1.89 plus two more public IPs, and the private IP 10.1.0.2. The problem is that I am unable to access any website through any public IP: all of the public IPs open only the default website. I also cannot see the public IPs in the IIS website bindings; only the private IP is listed. On the server itself the public IPs likewise open only the default website, but I am able to open the websites using the private IP. The public IPs do point at the server correctly: I can open a Remote Desktop session to the server using a public IP, and, as I said, the public IPs serve the default IIS website without problems. Please help; I have been confused by this for the last two days.

    Read the article

  • reporting tool/viewer for large datasets

    - by FrustratedWithFormsDesigner
    I have a data processing system that generates very large reports on the data it processes. By "large" I mean that a "small" execution of this system produces about 30 MB of reporting data when dumped into a CSV file, and a large dataset is about 130-150 MB (I'm sure someone out there has a bigger idea of "large", but that's not the point... ;) Excel has the ideal interface for the report consumers in the form of its data lists: users can filter and segment the data on the fly to see the specific details they are interested in, and they can also add notes and markup to the reports, create charts, graphs, etc. They know how to do all this, and it's much easier to let them do it if we just give them the data. Excel was great for the small test datasets, but it cannot handle these large ones. Does anyone know of a tool that can provide a similar interface to Excel data lists but that can handle much larger files? The next tool I tried was MS Access. I found that the Access file bloats hugely (a 30 MB input file leads to about a 70 MB Access file, and after I open the file, run a report, and close it, the file is at 120-150 MB!), and the import process is slow and very manual (currently the CSV files are created by the same PL/SQL script that runs the main process, so there's next to no intervention on my part). I also tried an Access database with tables linked to the database tables that store the report data, and that was many times slower (for some reason, SQL*Plus could query and generate the report file in a minute or so, while Access would take anywhere from 2 to 5 minutes for the same data). (If it helps, the data processing system is written in PL/SQL and runs on Oracle 10g.)

    Read the article

  • How to generate a large image in the Compact Framework

    - by Buthrakaur
    I need to generate large images (an A4 image at 200 DPI; PNG format would be fine) in my Compact Framework application. This is impossible to do in the standard way due to memory limitations (such a big image will throw an OutOfMemoryException). Is there any library that offers file-backed image generation via a stream? Alternatively, I could generate many smaller strips of images (each strip representing a row of the large image) using the standard Bitmap approach, but I would need to merge them together afterwards. Is there any way to merge many smaller images into one large image without having to instantiate a large Bitmap instance (which would again cause an OOM)?

    Read the article

  • set up way of getting mysite.$domain

    - by jose silva
    Hi, I have several domains but only one website, with one database table per domain. For example: website.us (data from the USA goes to the database table main_usa) and website.co.uk (data from the UK goes to the database table main_uk). There is only one database, named after the website. The site is structured once, with variables like $sql = "select * from main_" . $countrycode . " where bla..bla...", and many other variables to catch the domain extension, and so on. Now, instead of having one full website for each domain, how can I set up a script, and where do I put it, in order to detect the domain that the user came in on? In my server root, do I create something like website.$domain? Something like the OLX website, but for different purposes. I hope I made myself clear. Thank you.
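
    One common approach is a single shared codebase that reads the requested host from $_SERVER['HTTP_HOST'] and maps it to the per-country table. A minimal sketch (the domain-to-table map below is hypothetical):

        <?php
        // Map each served domain to its country table; unknown hosts fall
        // back to a default. All domains point at this same document root.
        $tables = array(
            'website.us'    => 'main_usa',
            'website.co.uk' => 'main_uk',
        );

        $host  = strtolower($_SERVER['HTTP_HOST']);
        $table = isset($tables[$host]) ? $tables[$host] : 'main_usa';

        $sql = "select * from " . $table;  // add your WHERE clause here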

    Read the article

  • Syncing Large Directories/Filesystems using USB Drive [closed]

    - by Alan Lue
    Does anyone have a solution for syncing large directories/filesystems using just a USB flash drive (and specifically without using a network connection)? The objective is simply to sync a user directory between two computers. The contents of the user directory could amount to a large quantity of data—say, a quantity larger than could be stored on any single USB drive—but the aggregate size of changes that must be propagated by a single sync could easily fit on a USB drive. As an example, suppose a user directory is already synchronized between a desktop and a laptop computer. Here's a use case: Some changes are made in the user directory on the desktop. We mount a USB drive onto the desktop and copy whatever changes need to be applied to the laptop user directory in order to synchronize the desktop and laptop user directories. We now mount the USB drive onto the laptop and apply the changes. The desktop and laptop user directories are now synchronized. Any ideas? Alan

    Read the article

  • Setting a session/cookie via an Ajax request made to another website

    - by user596805
    Hi, this is my problem: I have a website, example.com, in whose index.html I introduced a <script src="website.net/js.js"></script>. As you can see, the script is on another web server. In js.js I have some data that I want to send to PHP. For that I am using Ajax: I make a GET request to "website.net/data.php". In data.php everything is OK and I receive the value, but I want to set a cookie whose value is what I received through Ajax. Here is the problem: the setcookie function says the cookie was set, but when I check in the browser, there's no cookie! It works fine if the index.html file where I use <script src="website.net/js.js"></script> is hosted on the same domain I make the request to; if it is on another domain, it doesn't work any more. I have read something about cross-site Ajax, but I don't want to send anything back to example.com. All I want is to send some data from example.com to website.net and then set a cookie based on that value. Thank you very much, and sorry for my English! Later edit: I am not very used to this website yet. From example.com I take a single value. On website.net I receive that value and check whether the cookie is already set; if it is not, I set it. On the same page on website.net, I use this cookie too.
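
    For what it's worth, a minimal sketch of a data.php like the one described (the parameter and cookie names are hypothetical). The symptom matches third-party cookie blocking: when the request is triggered from a page on example.com, browsers treat a cookie set by website.net as third-party and may silently drop it, even though setcookie() itself reports success:

        <?php
        // data.php on website.net -- receives the Ajax value and sets a
        // cookie scoped to website.net. Note that PHP can never set a
        // cookie for example.com from here; the domain parameter may only
        // name the responding host.
        $value = isset($_GET['value']) ? $_GET['value'] : '';

        if ($value !== '' && !isset($_COOKIE['tracked_value'])) {
            setcookie('tracked_value', $value, time() + 30 * 86400, '/', '.website.net');
        }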

    Read the article

  • Pushing Large Files to 500+ Computers [closed]

    - by WMIF
    I work with a team to manage 500-600 rented Windows 7 computers for an annual conference. We have a large amount of data that needs to be synced to these computers, up to 1 TiB. The computers are divided into rooms and connected through unmanaged gigabit switches. We prepare these computers ahead of time with the Windows installation and configuration, plus any files that are available to us before we send the base image in for replication by the rental company. Every year, presenters approach us on site with gigabytes of data that need to be pushed to the room they will be presenting in. Sometimes they have only a few small files, such as a slide PDF, but sometimes it is much larger, around 5 GiB. Our current strategy for pushing these files uses batch scripts and RoboCopy. For the large pushes, we actually use a BitTorrent client to generate a torrent file, and then we use the batch/RoboCopy setup to push the torrent into a folder on the remote machines that is monitored by an installed BitTorrent client. Often this data needs to be pushed immediately, within a small time window. We have several machines in a control room, identical to the machines on the floor, that we use for these pushes. We occasionally need to execute a program on the remote machines, and we currently use batch and PsExec to handle this task. We would love to be able to respond to these last-minute pushes with "sorry, your own fault", but it won't happen. The BitTorrent method has given us a much faster response time, but the whole batch process can get messy when multiple jobs are being pushed. We use Enterprise Ghost for other processes, but it doesn't work well at this scale, and it is really quite expensive for a once-a-year task like this. EDIT: There is a hard requirement that the remote machines on the floor run Windows. The control machines have no hard OS requirement. I would really like to stay away from multicast because of complications with upstream routers. Is multicast or BitTorrent the better way to go for this? Is there another protocol that might work better?

    Read the article

  • How does Dropbox version/upload large files?

    - by barfoon
    Hey everyone, I have a free Dropbox account (2 GB), and I was wondering how the versioning of large files works. I have a full backup of all my web files that sits at just over 1 GB. After the initial 1 GB upload, every time it syncs will Dropbox figure out the delta of the file, or will it have to upload the entire thing again to version it? It would be cool to always have an up-to-date version of a large file, but I don't want to kill my bandwidth uploading 1 GB every time. Is this possible? Thanks,

    Read the article

  • Website & forum sharing the same login credentials?

    - by Brian
    I am going to be running a small site (maybe 100 hits a week) and I am looking for a quick and easy way to share login information between the main website, a control panel (Webmin, cPanel, or something similar), and the forum: one login grants access to all three. The website itself won't have a use for the login, per se, but it will display "logged in" when you are on it. Any custom solutions, thoughts, logic, or examples?
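
    A minimal sketch of the usual single-table approach, assuming the site and the forum can read one shared users table (every name below is hypothetical; real forum software generally hashes passwords with its own scheme, which is what makes this integration fiddly in practice):

        <?php
        // Shared helper, included by both the site and the forum: resolve
        // the session to a row in the common `users` table.
        session_start();

        function current_user(PDO $db) {
            if (empty($_SESSION['user_id'])) {
                return null;
            }
            $stmt = $db->prepare('SELECT id, username FROM users WHERE id = ?');
            $stmt->execute(array($_SESSION['user_id']));
            $row = $stmt->fetch(PDO::FETCH_ASSOC);
            return $row === false ? null : $row;
        }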

    Read the article

  • Copying large file from SD to HDD via USB failing on Ubuntu

    - by Kent Boogaart
    Hi, I'm attempting to copy some large files from my camera (a Canon EOS 500D) to my laptop, which is running 64-bit Ubuntu 9.04, using USB to connect the two devices. For most files it is simply a matter of Ctrl-C and Ctrl-V, and I have done this successfully many times with both photos and small movies (e.g. 180 MB). However, when I attempt this with very large files (e.g. 3 GB), the copy starts with a lot of activity on both the camera and the laptop, but after 10 minutes or so the camera is automatically unmounted and the copy fails to complete. I have read that this might be because the device is not mounting as a mass storage device, but I cannot see any obvious way to change this behavior. Can anyone offer any direction here? I'll get a USB card reader if necessary, but I'd prefer to be able to just plug my camera in. Thanks

    Read the article

  • Accessing large log files on a Unix machine with TextPad

    - by Jason
    Hi, I'm interested in accessing large log files on a Unix server with TextPad (TextPad for historical reasons; I personally prefer less, awk, grep, etc., of course), but I have many personnel who would rather use TextPad: they have years of experience with it and can tweak it to do whatever they want. The problem is that if I connect with, for example, WinSCP to get the log files into TextPad, it first fetches the full log, so the user has to wait and memory bloats, etc. I would rather have TextPad somehow access the Unix machine and fetch only the relevant segment of the log file (large log files can be gigabytes). Does anyone know how this can be achieved?

    Read the article

  • imagePickerController move and scale does not work

    - by ghostrider
    Here is my code:

        -(void)takePhoto {
            UIImagePickerController *imagePickerController = [[UIImagePickerController alloc] init];
            imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
            //imagePickerController.editing = YES;
            imagePickerController.allowsEditing = YES;
            imagePickerController.delegate = self;
            [self presentViewController:imagePickerController animated:YES completion:NULL];
        }

        #pragma mark - Image picker delegate methods

        -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
            [picker dismissViewControllerAnimated:NO completion:nil];
            [self.Picture setImage:[info objectForKey:UIImagePickerControllerOriginalImage]];
        }

    I have implemented the UINavigationControllerDelegate and UIImagePickerControllerDelegate protocols. The image is taken and I can see the move-and-scale box, but when I move it, the box returns to the initial position, like it bounces back. Why is that?

    Read the article

  • Windows XP - Website inaccessible on a single PC in the LAN

    - by DorentuZ
    For several days now, a website has been inaccessible from a single PC in the LAN; on the other PCs it works just fine, and as far as I know it is only this one website that is affected. The website times out in every web browser I've tried (IE8, Firefox, and Chrome), yet traceroute, nmap, and telnet all work just fine. I've even tried multiple user accounts and safe mode, but that didn't work either. As a side note, using a Linux live CD did work: I could access the website without any problems. The hosts file is the Windows default, and the IP and DNS settings on the network adapter are normal as well. No strange processes are running and no viruses were found. According to TCPView and netstat there are connections to the domain, but every request in the browser results in a timeout. Any idea what's happening?

    Read the article

  • Website is not accessible from a server that is using a proxy

    - by Bhoot
    I hosted a website on a Windows 2008 R2 server that runs in a private domain. I set up bindings on ports 80 and 443 for HTTP and HTTPS respectively, and also created inbound rules for ports 80 and 443 in Windows Firewall. After doing all this, I am still not able to access my website from a remote machine. IE says "Internet Explorer cannot display the webpage" and Chrome says "Oops! Google Chrome could not find xxxxxx". I tried accessing the website by IP address, but no luck. I tried to ping the server, but it says "TTL expired in transit". I then found advice online to check whether there is some kind of proxy between me and the server: www.getip.com reports one IP address for me, but ipconfig /all gives a different one. Is it really a problem if we use a proxy? I am not sure I have concluded this correctly, but is there any way to resolve this issue? Update: I figured it out. I have to call the website by its external IP address; because of the proxy settings, I was not able to reach the website by the server's own IP or machine name.

    Read the article

  • Website project takes a long time to load in VS.NET 2008

    - by rm
    After getting a new computer (a lot faster than the one I used before), my solutions take A LONG TIME (3-4 minutes) to load in VS.NET 2008. I have only two projects in the solution: a DB project and a website project (from IIS). If I remove the website project from the solution, it loads instantly; when I add that same website project back to the open solution, it also loads instantly. The only time it's slow is when I open the solution referencing my website project. I had the exact same setup (as far as VS is concerned) on my old box and never had this problem. Any ideas?

    Read the article

  • Using Large Arrays in VB.NET

    - by Tim
    I want to extract large amounts of data from Excel, manipulate it, and put it back. I have found that the best way to do this is to extract the data from an Excel Range into a large array, change the contents of the array, and write it back to the Excel Range. I am now rewriting the application using VB.NET 2008/2010 and wish to take advantage of any new features. Currently I have to loop through the contents of the array to find elements with certain values, and sorting large arrays is cumbersome. I am looking to use the new features, including LINQ, to manipulate the data in my array. Does anybody have any advice on the easiest ways to filter, query, sort, etc. the data in a large array? Also, what are reasonable limits on the size of the array? Many thanks

    Read the article

  • Can't open large remote RSS files

    - by anarchOi
    I'm setting up an RSS feed in vBulletin to import the entries into the forum. The feed is very large (2000+ entries) and is hosted on a different server (which I control). The problem is that I cannot add this feed to vBulletin; I am getting a weird error. If I edit the feed and remove half of the entries, it works correctly, which makes me think the file is too large. I suppose I have to change a setting in php.ini or something to allow bigger files to be opened, but what do I need to look for? Thanks. I'm on Debian.
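
    A guess at the PHP limits that typically cap how large a remote feed can be fetched and parsed; these are assumptions to test, not a confirmed fix. They can be raised globally in php.ini or per request:

        <?php
        // Candidate limits for a 2000+ entry remote feed (the values are
        // illustrative; tune them to the actual feed size).
        ini_set('memory_limit', '256M');          // room to hold and parse the XML
        ini_set('default_socket_timeout', '120'); // slow fetch from the remote host
        set_time_limit(120);                      // overall script runtime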

    Read the article

  • SEO Marketing - How to Promote Your Website and Gain More Traffic?

    Having problems promoting your website? Do you risk everything by trying to put your website on top with a weak SEO marketing strategy? SEO marketing is a very important part of promoting your website and marketing your products: it helps you gain more traffic and increase your page rank. However, the money is wasted if the strategy behind it is weak. Remember that people nowadays use the internet to find information on any website, possibly including yours.

    Read the article
