Search Results

Search found 75233 results on 3010 pages for 'data distribution service'.


  • Improving performance when pasting 2000 rows of data with validations

    - by Lohit
    I have N rows (where N could easily be 1000 or more) on an Excel spreadsheet, and the sheet in our project has 150 columns. Our application needs data to be copied (using normal Ctrl+C) and pasted (using Ctrl+V) from the Excel sheet onto our GUI sheet. Copy-pasting 1000 records takes around 5-6 seconds, which is acceptable for our requirement. The problem is that we also need to make sure the entered data is valid: we have to validate the data in each row, generate appropriate error messages, and format the data as required, which means parsing and evaluating each row at runtime. All of the formatting rules and validations come from the back-end database and are held in a data table (dtValidateAndFormatConditions); there are around 50 conditions. So you can see how slow the whole process becomes, since N x 150 x 50 operations are required to complete it. Initially it took approximately 2-3 minutes, but I have now reduced that to 20-30 seconds. I achieved the speed-up by writing an expression parser of my own, not by changing the algorithm. Is there any other way I can improve performance, by using divide and conquer or some other mechanism? Currently I am not really sure how to go about this. Here is what part of my code looks like:

    ```csharp
    public virtual void ValidateAndFormatOnCopyPaste(DataTable DtCopied, int CurRow)
    {
        foreach (DataRow dRow in dtValidateAndFormatConditions.Rows)
        {
            string Condition = dRow["Condition"].ToString();
            string FormatValue = dRow["Value"].ToString();
            GetValidatedFormattedData(DtCopied, ref Condition, ref FormatValue, CurRow);
            Condition = Parse(Condition);
            dRow["Condition"] = Condition;
            FormatValue = Parse(FormatValue);
            dRow["Value"] = FormatValue;
        }
    }
    ```

    The above method is called row by row, like this:

    ```csharp
    public override void ValidateAndFormat(DataTable dtChangedRecords, CellRange cr)
    {
        int iRowStart = cr.Row, iRowEnd = cr.Row + cr.RowCount;
        for (int iRow = iRowStart; iRow < iRowEnd; iRow++)
        {
            ValidateAndFormatOnCopyPaste(dtChangedRecords, iRow);
        }
    }
    ```

    Please note that my question is after a more algorithmic solution than code optimization; however, any answers containing code-related optimizations will be appreciated as well. (Tagged LINQ because, although not shown, I have been using LINQ in some parts of my code.)
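    One algorithmic angle worth noting, sketched below under assumptions: since the ~50 conditions are the same for every row, they can be parsed once up front and only evaluated per row, turning N x 150 x 50 parse operations into 50 parses plus cheap evaluations. ParsedExpression and ParseToTree are hypothetical stand-ins for the custom expression parser, not names from the original code.

    ```csharp
    using System.Collections.Generic;

    // Minimal sketch: hoist parsing out of the per-row loop.
    // ParsedExpression and ParseToTree are assumed/hypothetical names.
    public class ConditionCache
    {
        private readonly Dictionary<string, ParsedExpression> cache =
            new Dictionary<string, ParsedExpression>();

        public ParsedExpression Get(string condition)
        {
            ParsedExpression parsed;
            if (!cache.TryGetValue(condition, out parsed))
            {
                parsed = ParseToTree(condition); // expensive: once per condition
                cache[condition] = parsed;
            }
            return parsed;                       // cheap: reused for every row
        }
    }
    ```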


  • Service for converting SWFs with ActionScript to Video (MPEG, AVI, or MOV)

    - by pcooley
    The SWF files generated by our application are a basic template that references external resources (images and textual data) that ActionScript uses to drive the display. The SWF is thus responsible for the creative layout of the screen in the Flash player, and it is the result of this ActionScript, the images, and the textual data that needs to be converted to a video format. Is anyone familiar with an online service (or an application) that can convert the .swf files our site generates to a video format such as .avi, .mpeg, or .mov? Note: a more common case might be the conversion of an embedded FLV to a video, but that is not our need.


  • How to return ArrayList results from an IntentService

    - by gcl1
    I have an IntentService that loads up an ArrayList with data from a network source (AWS SDB tables). The ArrayList lives in a global space accessible to both the calling Activity and the IntentService (like this: appState = ((App)getApplicationContext())). When the IntentService is done, it notifies the Activity through a ResultReceiver, and the Activity calls adapter.notifyDataSetChanged() to update the ListView. This solution works most of the time, but it violates the rule that only the UI thread should make changes to the data underlying a ListView. So, as it is, I sometimes get an error: "The content of the adapter has changed but ListView did not receive a notification." I think this must be a common situation; please let me know if you have any suggestions or best practices for this problem. Here are the three options I'm aware of:

    1. Keep the IntentService, and have it store the results in another "working" ArrayList, also in the global space. When the result is ready, the IntentService calls the ResultReceiver (on the UI thread), which can then: a) copy the result to the ArrayList associated with the ListView, and b) call adapter.notifyDataSetChanged(). CONS: I don't like the idea of putting temp/working data in a global space, and copying the result list seems inefficient.

    2. Keep the IntentService, and have it pass the results back through a Bundle loaded with a parcelable ArrayList (sketched below). CONS: I'm not sure whether this approach would scale for very large result sets. It also requires copying the result list.

    3. Switch to a Service that builds a local copy of the result list, and have the Activity directly access the address space of the Service in order to read the result list. CON: still requires copying the results to the ArrayList associated with the ListView.

    Thank you.
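    A minimal sketch of option 2, assuming the list elements implement Parcelable; the names ("results", MyItem, items, adapter) are illustrative, not from the original code:

    ```java
    // In the IntentService, when the work is done:
    Bundle bundle = new Bundle();
    bundle.putParcelableArrayList("results", resultList);
    receiver.send(Activity.RESULT_OK, bundle);

    // In the Activity's ResultReceiver (created with a Handler attached to
    // the main Looper, so this callback runs on the UI thread):
    @Override
    protected void onReceiveResult(int resultCode, Bundle resultData) {
        ArrayList<MyItem> results = resultData.getParcelableArrayList("results");
        items.clear();          // the list backing the adapter
        items.addAll(results);  // mutate it only here, on the UI thread
        adapter.notifyDataSetChanged();
    }
    ```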


  • BULK INSERT from one table to another all on the server

    - by steve_d
    I have to copy a bunch of data from one database table into another. I can't use SELECT ... INTO because one of the columns is an identity column, and I also have some changes to make to the schema. I was able to use the Export Data wizard to create an SSIS package, which I then edited in Visual Studio 2005 to make the desired changes. It's certainly faster than an INSERT INTO, but it seems silly to download the data to a different computer just to upload it back again (assuming I'm correct that that's what the SSIS package is doing). Is there an equivalent to BULK INSERT that runs directly on the server, allows keeping identity values, and pulls data from a table? (As far as I can tell, BULK INSERT can only pull data from a file.)

    Edit: I do know about IDENTITY_INSERT, but because a fair amount of data is involved, INSERT INTO ... SELECT is rather slow. SSIS/BULK INSERT dumps the data into the table without regard to indexes and logging, so it's faster. (Of course creating the clustered index on the table once it's populated is not fast either, but it's still faster than the INSERT INTO ... SELECT I tried in my first attempt.)

    Edit 2: The schema changes include (but are not limited to) the following:

    1. Splitting one table into two new tables. In the future each will have its own IDENTITY column, but for the migration I think it will be simplest to use the identity from the original table as the identity for both new tables. Once the migration is over, one of the tables will have a one-to-many relationship to the other.
    2. Moving columns from one table to another.
    3. Deleting some cross-reference tables that only cross-referenced one-to-one. Instead, the reference will become a foreign key in one of the two tables.
    4. Some new columns will be created with default values.
    5. Some tables aren't changing at all, but I have to copy them over due to the "put it all in a new DB" request.
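    For reference, a minimal server-side sketch using INSERT ... SELECT with a TABLOCK hint, which SQL Server can minimally log under the SIMPLE or BULK_LOGGED recovery model (the table names are illustrative; minimal logging for INSERT ... SELECT also requires SQL Server 2008 or later and an empty or heap target, so treat this as a sketch to verify against your own data volumes):

    ```sql
    -- Copy rows between tables entirely on the server, preserving identity values.
    SET IDENTITY_INSERT dbo.TargetTable ON;

    INSERT INTO dbo.TargetTable WITH (TABLOCK) (Id, ColA, ColB)
    SELECT Id, ColA, ColB
    FROM dbo.SourceTable;

    SET IDENTITY_INSERT dbo.TargetTable OFF;
    ```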


  • JavaScript terminates after trying to select data from an object passed to a function

    - by Silmaril89
    Here is my JavaScript:

    ```javascript
    $(document).ready(function () {
        var queries = getUrlVars();
        $.get("mail3.php", { listid: queries["listid"], mindex: queries["mindex"] }, showData, 'html');
    });

    function showData(data) {
        var response = $(data).find("#mailing").html();
        if (response == null) {
            $("#results").html("<h3>Server didn't respond, try again.</h3>");
        } else if (response.length) {
            var old = $("#results").html();
            old = old + "<br /><h3>" + response + "</h3>";
            $("#results").html(old);
            var words = response.split(' ');
            words[2] = words[2] * 1;
            words[4] = words[4] * 1;
            if (words[2] < words[4]) {
                var queries = getUrlVars();
                $.get("mail3.php", { listid: queries["listid"], mindex: words[2] }, function (data) { showData(data); }, 'html');
            } else {
                var done = $(data).find("#done").html();
                old = old + "<br />" + done;
                $("#results").html(old);
            }
        } else {
            $("#results").html("<h3>Server responded with an empty reply, try again.</h3>");
        }
    }

    function getUrlVars() {
        var vars = [], hash;
        var hashes = window.location.href.slice(window.location.href.indexOf('?') + 1).split('&');
        for (var i = 0; i < hashes.length; i++) {
            hash = hashes[i].split('=');
            vars.push(hash[0]);
            vars[hash[0]] = hash[1];
        }
        return vars;
    }
    ```

    After the first line in showData (var response = $(data).find("#mailing").html();), the JavaScript stops. If I put an alert before it, the alert pops up; after it, it doesn't pop up. There must be something wrong with using $(data), but why? Any ideas would be appreciated.
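    One possible cause, offered as an assumption rather than a confirmed diagnosis: if #mailing is a top-level element of the returned fragment, $(data).find() will not locate it, and jQuery can also choke when the response starts with stray text nodes. Wrapping the raw response in a detached element makes find() work in both cases:

    ```javascript
    // Minimal sketch: wrap the raw HTML so #mailing is found whether it is
    // nested inside the response or sits at its top level.
    var response = $("<div/>").html(data).find("#mailing").html();
    ```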


  • MATLAB query about for loop, reading in data and plotting

    - by mp7
    Hi there, I am a complete novice at using MATLAB and am trying to work out whether there is a way of optimising my code. Essentially I have data from model outputs and I need to plot them using MATLAB. In addition I have reference data (with 95% confidence intervals) which I plot on the same graph to get a visual idea of how close the model outputs and the reference data are. For the model outputs I have several thousand files (numbered sequentially) which I open in a loop and plot. The problem/question I have is whether I can preprocess the data and then plot later, to save time. The issue I seem to be having when I try this is that the legend either does not appear or is inaccurate. My code (apologies if it is not elegant):

    ```matlab
    fn = xlsread(['tbobserved' '.xls']);
    time = fn(:,1);
    totalreference = fn(:,4);
    totalreferencelowerci = fn(:,6);
    totalreferenceupperci = fn(:,7);

    figure
    plot(time, totalreference, '-', time, totalreferencelowerci, '--', time, totalreferenceupperci, '--');
    xlabel('Year');
    ylabel('Reference incidence per 100,000 population');
    title('Total');
    clickableLegend('Observed reference data', 'Totalreferencelowerci', 'Totalreferenceupperci', 'Location', 'BestOutside');
    xlim([1910 1970]);
    hold on

    start_sim = 10000;
    end_sim = 10005;
    h = zeros(1, 1000);

    for i = start_sim:end_sim  % is there any way of doing this earlier to save time?
        a = int2str(i);
        incidenceFile = strcat('result_', 'Sim', '_', a, 'I_byCal_total.xls');
        est_tot = importdata(incidenceFile, '\t', 1);
        cal_tot = est_tot.data;
        magnitude = 1;
        t1 = cal_tot(:,1) + 1750;
        totalmodel = cal_tot(:,3) + cal_tot(:,5);
        h(a) = plot(t1, totalmodel);
        xlim([1910 1970]);
        ylim([0 500]);
        hold all
        clickableLegend(h(a), a, 'Location', 'BestOutside')
    end
    ```

    Essentially I was hoping to find a way of reading in the data first and then plotting later, i.e. optimising the code. I hope you might be able to help. Thanks. mp
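    One way to separate the slow file reading from the plotting, sketched below under assumptions (same file-name pattern as above; note also that h(a) in the original indexes the handle array with a string, which is one likely reason the legend misbehaves):

    ```matlab
    % Minimal sketch: read everything into a cache first, then plot in one pass
    % and build the legend once from all the handles.
    nSims = end_sim - start_sim + 1;
    t1All = cell(1, nSims);
    totalAll = cell(1, nSims);
    labels = cell(1, nSims);
    for i = start_sim:end_sim
        k = i - start_sim + 1;                       % 1-based cache index
        incidenceFile = strcat('result_Sim_', int2str(i), 'I_byCal_total.xls');
        est_tot = importdata(incidenceFile, '\t', 1);
        cal_tot = est_tot.data;
        t1All{k} = cal_tot(:,1) + 1750;
        totalAll{k} = cal_tot(:,3) + cal_tot(:,5);
        labels{k} = int2str(i);
    end
    % Plot later from the cache; no file I/O happens in this loop.
    h = zeros(1, nSims);
    hold on
    for k = 1:nSims
        h(k) = plot(t1All{k}, totalAll{k});
    end
    xlim([1910 1970]); ylim([0 500]);
    clickableLegend(h, labels, 'Location', 'BestOutside');
    ```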


  • Ping remote server and wait to get data

    - by infinity
    Hi, I'm building my first application for Android and I've reached a point where I can't find a solution; I don't even know what to search for on Google. The problem: I am pinging a remote server with a GET request through the application, passing some parameters like file_id. The server then returns a confirmation if the file exists, or an error otherwise, both in plain text. The error string is $$$ERROR$$$. The confirmation is actually a JSON string that holds the path to the file. If the file doesn't exist on the server, the server generates the error message and starts downloading and processing the file, which normally takes 10-30 seconds. What would be the best way to check whether the file is ready for download? I have a DownloadFile class that extends AsyncTask, but before I reach the point of downloading the file I need the URL, which depends on the previous request, and that request lives in the main class on the UI thread. Here is some code:

    ```java
    public class MainActivity extends Activity {

        private String getInfo() {
            // Create a new HttpClient and GET request
            HttpClient httpClient = new DefaultHttpClient();
            HttpGet httpGet = new HttpGet(infoUrl);
            StringBuilder sb = null;
            String data;
            JSONObject jObject = null;

            try {
                HttpResponse response = httpClient.execute(httpGet);
                // This might equal "$$$ERROR$$$" if no file exists
                sb = inputStreamToString(response.getEntity().getContent());
            } catch (ClientProtocolException e) {
                Log.v("Error: pushItem ClientProtocolException: ", e.toString());
            } catch (IOException e) {
                Log.v("Error: pushItem IOException: ", e.toString());
            }

            // Clean the data so it is compliant JSON
            data = sb.toString().replace("info = ", "");

            try {
                jObject = new JSONObject(data);
                data = jObject.getString("h");
                fileTitle = jObject.getString("title");
            } catch (JSONException e) {
                e.printStackTrace();
            }

            downloadUrl = String.format(downloadUrl, fileId, data);
            return downloadUrl;
        }
    }
    ```

    So my idea was to fetch the content and, if it equals $$$ERROR$$$, loop until the JSON data is returned, but I guess there is a better solution. Note: I don't have control over the server output, so I have to deal with what I have.
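    A minimal sketch of the polling idea done off the UI thread, for instance inside doInBackground() of the AsyncTask; the method and field names (fetchInfo, maxAttempts) are assumptions for illustration:

    ```java
    // Poll in the background thread with a short delay between attempts,
    // instead of looping on the UI thread.
    private String waitForDownloadUrl() throws InterruptedException {
        final int maxAttempts = 15;          // ~45 s total at 3 s per attempt
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            String body = fetchInfo();       // the GET request shown above
            if (!body.contains("$$$ERROR$$$")) {
                return body;                 // JSON is ready; parse it now
            }
            Thread.sleep(3000);              // server is still processing
        }
        return null;                         // give up and report a timeout
    }
    ```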


  • JSON service from data scraping with PHP

    - by fredz0003
    I am trying to figure out the best way to make this work; I am new to PHP. I was able to make my script find the specific data in my .htm file with the following script, tested on my local server:

    ```php
    <?php
    include ('simple_html_dom.php');

    // Create DOM from URL or local file
    $html = file_get_html('Lotto Texas.htm');

    // Find each td with class currLotWinnum and print it
    echo "<b>The winning numbers are</b><br>";
    foreach ($html->find('td.currLotWinnum') as $winNumbers) {
        echo $winNumbers->innertext . '<br>';
    }
    ?>
    ```

    I need some light here: ultimately I would like to create a web service that returns JSON, and access that data from my iOS application using the NSJSONSerialization class.
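    A minimal sketch of the JSON endpoint built on the same scraping code (the "winningNumbers" key is an illustrative choice, not required by NSJSONSerialization):

    ```php
    <?php
    // Expose the scraped numbers as JSON for the iOS client.
    include ('simple_html_dom.php');

    $html = file_get_html('Lotto Texas.htm');

    $numbers = array();
    foreach ($html->find('td.currLotWinnum') as $winNumbers) {
        $numbers[] = trim($winNumbers->innertext);
    }

    header('Content-Type: application/json');
    echo json_encode(array('winningNumbers' => $numbers));
    ?>
    ```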


  • Persistence scheme & state data for low memory situations (iPhone)

    - by Robin Jamieson
    What happens to state information held in a class's variables after coming back from a low-memory situation? I know that views get unloaded and then reloaded later, but what about ancillary classes, and the data held in them, that are used by the controller that launched the view? Sample scenario in question:

    ```objc
    @interface MyCustomController : UIViewController {
        ServiceAuthenticator *authenticator;
    }
    - (id)initWithAuthenticator:(ServiceAuthenticator *)auth;
    // The user may press a button that will cause the authenticator
    // to post some data to the service.
    - (IBAction)doStuffButtonPressed:(id)sender;
    @end

    @interface ServiceAuthenticator {
        BOOL hasValidCredentials; // YES if user's credentials have been validated
        NSString *username;
        NSString *password;       // password is not stored in plain text
    }
    - (id)initWithUserCredentials:(NSString *)username password:(NSString *)aPassword;
    - (void)postData:(NSString *)data;
    @end
    ```

    The app delegate creates the ServiceAuthenticator class with some user data (read from a plist file) and the class logs the user in with the remote service. Inside MyAppDelegate's applicationDidFinishLaunching:

    ```objc
    - (void)applicationDidFinishLaunching:(UIApplication *)application {
        ServiceAuthenticator *auth = [[ServiceAuthenticator alloc]
            initWithUserCredentials:username password:userPassword];
        MyCustomController *controller = [[MyCustomController alloc] initWithNibName:...];
        controller.authenticator = auth;

        // Configure and show the window
        [window addSubview:..];  // make everything visible
        [window makeKeyAndVisible];
    }
    ```

    Then, whenever the user presses a certain button, MyCustomController's doStuffButtonPressed: is invoked:

    ```objc
    - (IBAction)doStuffButtonPressed:(id)sender {
        [authenticator postData:someDataFromSender];
    }
    ```

    The authenticator in turn checks whether the user is logged in (a BOOL variable indicates login state) and, if so, exchanges data with the remote service. ServiceAuthenticator is the kind of class that validates the user's credentials only once; all subsequent calls to the object are to postData. Once a low-memory scenario occurs and the associated nib and MyCustomController get unloaded, what is the process for setting the ServiceAuthenticator class and its former state back up when they are reloaded? I'm periodically persisting all of the data in my actual model classes. Should I also consider persisting the state data in these utility-style classes? Is that the pattern to follow?
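    If you do decide to persist the authenticator, one conventional approach is adopting NSCoding; a minimal sketch under assumptions (the key names are illustrative, and the password would belong in the keychain rather than in the archive):

    ```objc
    // ServiceAuthenticator conforming to NSCoding (pre-ARC style, as above).
    - (void)encodeWithCoder:(NSCoder *)coder {
        [coder encodeBool:hasValidCredentials forKey:@"hasValidCredentials"];
        [coder encodeObject:username forKey:@"username"];
    }

    - (id)initWithCoder:(NSCoder *)coder {
        if ((self = [super init])) {
            hasValidCredentials = [coder decodeBoolForKey:@"hasValidCredentials"];
            username = [[coder decodeObjectForKey:@"username"] retain];
            // Re-fetch the password from the keychain here, not from the archive.
        }
        return self;
    }
    ```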


  • Using a JSON web service from a Java client application

    - by user383341
    I am developing a client-side Java application with a bit of functionality that requires getting data from some web services that transmit in JSON (some RESTful, some not). No JavaScript, no web browser: just a plain JAR file that will run locally, with Swing for the GUI. This is not a new or unique problem; surely there must be some open-source libraries out there that will handle the JSON data transmission over HTTP. I've already found some that will parse JSON, but I'm having trouble finding any that will handle the HTTP communication needed to consume the JSON web service. So far I've found Apache Axis2, which might have at least part of the solution, but I don't see enough documentation to know whether it will do what I need, or how to use it. Maybe part of the problem is that I don't have experience with web services, so I'm not able to recognize a solution when I see it. I hope some of you can point me in the right direction. Examples would be helpful.
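    For what it's worth, the HTTP side needs no extra library at all; a minimal sketch using the JDK's own HttpURLConnection (the URL is a placeholder, and the returned string would be handed to whichever JSON parser you chose):

    ```java
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class JsonClient {
        // Fetch the raw JSON body of a GET endpoint as a String.
        public static String fetch(String urlString) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(urlString).openConnection();
            conn.setRequestProperty("Accept", "application/json");
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"));
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            in.close();
            return body.toString(); // hand this to a JSON parser
        }
    }
    ```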


  • grails services :: multiple projects

    - by naveen
    PROBLEM: I have multiple Grails projects (let's say appA, appB and appC), services to be precise, and I want to run them in a single grails-app, probably as a war deployment. How can I do this?

    REQUIREMENTS: I want this to be a single app since I am deploying it on a cloud and I don't have enough memory to hold all these service instances individually. The reason for multiple Grails projects is scalability, so that if later on I want to run 10 instances of appA, 3 instances of appB, and 1 instance of appC, I am able to do that.

    EDIT: Could I use something like 0mq, and would that be helpful in keeping the services separated from each other? How would I package my services? Reading the 0mq docs, it seems it can work with both in-process and external processes. Will async Grails requests over HTTP work with 0mq in-process/external MQ calls? I haven't used 0mq, but from the initial docs it seems workable; I need input from someone with experience of this scenario. Are there any other alternatives, or MQ alternatives?


  • “It Isn’t Easy At All; Otherwise, Everyone Would Be Doing It”

    - by Kathryn Perry
    A few months ago, JP Saunders (pictured left), who leads the go-to-market initiatives for the Oracle CX Service offering, kicked off a series of articles about modern customer service. He contends that to take care of customers, and the people that support those customers, companies need to make it easy to deliver consistently great experiences. But it's not easy; it's an art. The six posts in The Art of Easy series will help you better understand some of the customer service challenges you face and how to avoid common pitfalls. We pulled them all together here in one post for continuity and easy access.

    Saunders introduces the series with The Art of Easy: Make It Easy To Deliver Great Customer Service Experiences (Part 1).

    The Art of Easy: Offer Self Service With the Emphasis on Service (Part 2) by David Fulton (pictured left): David Fulton, Director of Product Management, Oracle Service Cloud, shares five tenets of customer self-service that move an organization closer to becoming a modern customer service business.

    Easy Decisions For Complex Problems (Part 3) by Heike Lorenz (pictured right): Heike Lorenz, Director of Global Product Marketing, Policy Automation, writes about automating service policies to ensure that the correct decisions are applied to the right people. The goal is to nurture the trusted relationships with customers during complex decision-making processes.

    Moving at the Speed of Easy (Part 4) by Chris Omland (pictured left): Chris Omland, Director of Product Management, Oracle Service Cloud, addresses the need for speed to keep up with customers' expectations. His advice: start with a platform that enables agile innovation, respects a company's unique needs, and has proven reliability to protect customer relationships.

    Knowledge Makes It Easy For Everyone (Part 5) by Nav Chakravarti (pictured right): Vice President Nav Chakravarti, Oracle Service Cloud, talks about managing the knowledge that customers need and want. He coaches readers on delivering answers to customers' questions easily, in context, with relevance, reliably, and accurately.

    Making Easy, Both Effective and Efficient (Part 6) by Melinda Uhland (pictured left): Melinda Uhland, Oracle CX Product Management, teaches us that happy agents produce happy customers. A modern customer service organization is one that invests in its agents and empowers them with tools to make them efficient and effective, which, in turn, improves customer results.


  • What limit should be used for iptables rate limiting on a web server?

    - by Registered User
    I see logs like the following on my web server:

    ```
    203.252.157.98 - :25:02 "GET //phpmyadmin/ HTTP/1.1" 404 393 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:03 "GET //phpMyAdmin/ HTTP/1.1" 404 394 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:03 "GET //pma/ HTTP/1.1" 404 388 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:04 "GET //dbadmin/ HTTP/1.1" 404 391 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:05 "GET //myadmin/ HTTP/1.1" 404 391 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:06 "GET //phppgadmin/ HTTP/1.1" 404 394 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:06 "GET //PMA/ HTTP/1.1" 404 389 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:07 "GET //admin/ HTTP/1.1" 404 389 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :25:08 "GET //MyAdmin/ HTTP/1.1" 404 392 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :27:36 "GET //phpmyadmin/ HTTP/1.1" 404 393 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :27:42 "GET //phpMyAdmin/ HTTP/1.1" 404 394 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :27:42 "GET //pma/ HTTP/1.1" 404 388 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - :27:43 "GET //dbadmin/ HTTP/1.1" 404 391 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    203.252.157.98 - - "GET //myadmin/ HTTP/1.1" 404 391 "-" "Made by ZmEu @ WhiteHat Team - www.whitehat.ro"
    ```

    and some more as follows:

    ```
    118.219.234.254 - - [19/Oct/2010:22:57:41 "GET /pma/scripts/setup.php HTTP/1.1" 404 399 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:41 "GET /scripts/setup.php HTTP/1.1" 404 397 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:42 "GET /sqlweb/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:42 "GET /web/phpMyAdmin/scripts/setup.php HTTP/1.1" 404 408 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:43 "GET /web/phpmyadmin/scripts/setup.php HTTP/1.1" 404 408 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:44 "GET /web/scripts/setup.php HTTP/1.1" 404 400 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:44 "GET /webadmin/scripts/setup.php HTTP/1.1" 404 403 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:45 "GET /webdb/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:22:57:45 "GET /websql/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:51 "GET /admin/phpmyadmin/scripts/setup.php HTTP/1.1" 404 407 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:52 "GET /admin/pma/scripts/setup.php HTTP/1.1" 404 404 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:52 "GET /admin/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:53 "GET /db/scripts/setup.php HTTP/1.1" 404 399 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:54 "GET /dbadmin/scripts/setup.php HTTP/1.1" 404 402 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:54 "GET /myadmin/scripts/setup.php HTTP/1.1" 404 403 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:55 "GET /mysql/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:55 "GET /mysqladmin/scripts/setup.php HTTP/1.1" 404 405 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:56 "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 405 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:56 "GET /phpadmin/scripts/setup.php HTTP/1.1" 404 403 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:57 "GET /phpmyadmin/scripts/setup.php HTTP/1.1" 404 404 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:57 "GET /pma/scripts/setup.php HTTP/1.1" 404 399 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:58 "GET /scripts/setup.php HTTP/1.1" 404 397 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:58 "GET /sqlweb/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:59 "GET /web/phpMyAdmin/scripts/setup.php HTTP/1.1" 404 408 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:38:59 "GET /web/phpmyadmin/scripts/setup.php HTTP/1.1" 404 408 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:39:00 "GET /web/scripts/setup.php HTTP/1.1" 404 400 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:39:01 "GET /webadmin/scripts/setup.php HTTP/1.1" 404 403 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:39:01 "GET /webdb/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    118.219.234.254 - - [19/Oct/2010:05:39:02 "GET /websql/scripts/setup.php HTTP/1.1" 404 401 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"
    ```

    I have two questions:

    1. When such an attack happens on my site, how do I detect it quickly, while the scanning is still going on?
    2. I have decided to rate-limit connections with iptables to reduce, to some extent, such DoS attacks by script kiddies scanning for vulnerabilities in phpMyAdmin or other scripts. How strict should the limit be so that genuine users do not get kicked out? What is the best practice here?
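    As a starting point for question 2, a minimal iptables sketch using the "recent" match; the thresholds (20 new connections in 10 seconds) are illustrative and should be tuned against your real traffic, not taken as a recommendation:

    ```bash
    # Track new connections to port 80 per source IP.
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --set --name HTTP
    # Drop a source that opens more than 20 new connections in 10 seconds.
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --update --seconds 10 --hitcount 20 --name HTTP -j DROP
    ```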


  • Creating Limited User Accounts on Ubuntu Server

    - by LonnieBest
    Using Ubuntu Server, I need to create some user accounts that have the following limitations:

    1. The user may only view and manipulate files in their home directory.
    2. The user may only execute commands related to rsync and sftp.

    I want users to be able to back up files using rsync, and I want them to be able to retrieve files using an SFTP client like FileZilla. Other than this, I don't want users to be able to view other files on the system, or to execute any commands that might mess with the system. I'm more of an Ubuntu desktop user and have very little experience administering a Linux server; most tutorials I've found assume knowledge that I don't have, so I'm having difficulty setting this up.
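    One possible approach, sketched here as an assumption rather than a tested recipe: the rssh restricted shell is designed for exactly this combination, denying everything except the protocols you explicitly enable (here rsync and sftp). The user name is illustrative:

    ```bash
    # Install the restricted shell and create a user whose login shell is rssh.
    sudo apt-get install rssh
    sudo useradd -m -s /usr/bin/rssh backupuser
    sudo passwd backupuser

    # Then, in /etc/rssh.conf, uncomment only the protocols to allow:
    #   allowsftp
    #   allowrsync
    ```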


  • SQL Server 2000 tables

    - by user40766
    We currently have a SQL Server 2000 database with one table containing data for multiple users. The data is keyed by memberid, which is an integer field, and the table has a clustered index on memberid. The table is now about 200 million rows, and indexing and maintenance are becoming issues. We are debating splitting the table into one table per user. This would imply that we would end up with a very large number of tables, potentially up to 2,147,483,647, considering just positive values. My questions:

    1. Does anyone have any experience with a SQL Server (2000/2005) installation with millions of tables?
    2. What are the implications of this architecture with regard to maintenance and access using Query Analyzer, Enterprise Manager, etc.?
    3. What are the implications of having such a large number of indexes in a database instance?

    All comments are appreciated. Thanks.


  • Automating SSRS 2008 R2 report snapshots while still running reports with the most recent data

    - by Mr Shoubs
    I would like to automate a report snapshot, but there is only an option to take a snapshot on the Report History tab. All the resources I've found suggest going to the processing options and selecting "Render this report from a snapshot". But I don't want to do that: when I go to a report, I want the most recent data. However, daily at midnight I'd like to take a snapshot and store it in the history, in case I want to compare the midnight reports from the last few weeks. Or am I doing this wrong, and do I have to create a subscription instead? Note: this is for an auditing database with far too much data to query a range of more than one day (a single day has over a million rows on its own), and the reports are restricted accordingly.


  • Brainstorm: Flood/DoS/DDoS Attack prevention ideas.

    - by Gnarly
    This is not a question asking how to stop an attack. This is simply a thread for anyone and everyone to discuss ideas for preventing flood/DoS/DDoS attacks, dealing with them, and keeping your server alive during them. Do not discuss using third-party software; this is a place to present your own ideas and read others'. Post examples if you'd like. Post ideas for how to filter out flood attacks, and ideas for how to keep your server alive while under a heavy DDoS attack.


  • Australian Wifi Hotspot Providers

    - by Dan
    More informational than techy, but are there any Australians (or frequent visitors to Australia) who can tell me who the big players are in providing Wi-Fi hotspots over there? A colleague is going there for a few weeks and will need to connect back remotely, so I wanted to be able to steer him towards some of the bigger names. Thanks.


  • Associating multiple data with a single entry in Open Office Base

    - by idyllhands
    I'm trying to build a database that I can use to track the prices of groceries on certain dates. My problem is that I cannot figure out how to associate multiple data with a single entry. For example, carrots: the index would be "carrots", followed by a few categorizing fields (e.g., Produce | Vegetable); then I can enter a price, the date that the price was valid, the store that was selling it for that price, and so on. And the next time I buy carrots, I can just add a new set of pricing data that is associated with the original carrots entry. I know very little about database building, so if anyone has something I could just modify, I would greatly appreciate it. Alternatively, a step-by-step tutorial would be great.
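    The conventional relational shape for this is one table holding each grocery item once and a second table holding one row per price observation, linked back by a foreign key; a minimal sketch with illustrative table and column names (Base accepts standard SQL like this via Tools > SQL):

    ```sql
    CREATE TABLE Item (
        ItemID   INTEGER PRIMARY KEY,
        Name     VARCHAR(100),
        Category VARCHAR(50)
    );

    CREATE TABLE PriceObservation (
        ObservationID INTEGER PRIMARY KEY,
        ItemID        INTEGER NOT NULL,
        Price         DECIMAL(8,2),
        ObservedOn    DATE,
        Store         VARCHAR(100),
        FOREIGN KEY (ItemID) REFERENCES Item (ItemID)
    );
    ```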


  • Need some comparative data on Apache and IIS

    - by zod
    This is not a pure programming question, but it is very programming-related and I am not sure where I should ask it. If you don't know the answer, please don't downvote; if you know the right place to ask, please suggest it or move the question. We have a web application running on PHP 5, Zend Framework, and Apache. I need some comparative data showing that these technologies and this server are among the best and that thousands of domains use them. Do you know any website where I can get this type of data, listing: major websites developed in PHP, major domains running on Apache, major websites running on Zend, or a comparative case study on Apache and IIS, or PHP and .NET?


  • Stack, data and address space limits on an Ubuntu server

    - by PaulDaviesC
    I am running an Ubuntu server which has around 5000 users, and the users are allowed to SSH in to the system. In order to cap the memory used by any one process, I have capped the address-space limits using limits.conf. So my question is: should I also be limiting the data and stack segments? I feel that is not required, since I am already capping the address space. Are there any pitfalls if I do not cap the stack and data limits?
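    For reference, a minimal limits.conf sketch; the 512 MB value is illustrative, not a recommendation. The "as" item caps the whole address space, which by definition bounds the data and stack segments living inside it:

    ```
    # /etc/security/limits.conf  (sizes in KB)
    *    hard    as      524288
    # The equivalent per-segment caps, if you did want them:
    # *  hard    data    262144
    # *  hard    stack   16384
    ```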


  • How to warehouse rarely needed data out of MS SQL Server

    - by I__
    I have been asked to truncate a large table in MS SQL Server 2008. The data is not needed, but it might be needed once every two years. It will NEVER have to be changed, only viewed. The question is: since I don't need the data on a day-to-day basis, what do I do with it to protect it and back it up? Please keep in mind that I will need it to be accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows, and I need to truncate it to about 1 million rows.
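    For what it's worth, a minimal sketch of one approach, with illustrative names and date cutoff (assuming the cold rows are identifiable by a date column): copy them into a separate archive database, back that database up once, and only then remove them from the live table:

    ```sql
    -- Copy the cold rows into an archive database.
    SELECT *
    INTO ArchiveDB.dbo.BigTable_Archive
    FROM LiveDB.dbo.BigTable
    WHERE CreatedOn < '2011-01-01';

    -- One backup file to keep somewhere safe; restore it on demand.
    BACKUP DATABASE ArchiveDB TO DISK = 'D:\Backups\ArchiveDB.bak';

    -- Only after verifying the backup, remove the rows from the live table.
    DELETE FROM LiveDB.dbo.BigTable
    WHERE CreatedOn < '2011-01-01';
    ```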


  • Using LDAP to store customer data

    - by mechcow
    We wish to store some data in 389 Directory Server LDAP that doesn't fit well into the standard set of schemas that come with the product. Nothing too amazing; things like: when the customer joined, whether they are currently an active customer, a certificate[1], and which environment they are using. My question is this: should we register for an OID and start writing our own custom schema, OR is there a standard schema definition, not shipped with Directory Server, that we can download and use that would fit our needs? Should we munge/hack existing attributes and store the data among them (I'm strongly opposed to this, but would be interested in arguments about why it's better than extending)?

    [1] I know there is a field for this, userCertificate, but we don't want it used to authenticate the user for the purposes of binding.

    Using CentOS 5.5 with 389 Directory Server 8.1.
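    If you do go the custom-schema route, a user-schema entry for 389 DS looks roughly like the following minimal sketch; 1.3.6.1.4.1.99999 is a placeholder OID arc (you would substitute one you register), and the attribute names are illustrative:

    ```
    # 99user.ldif-style additions.
    attributeTypes: ( 1.3.6.1.4.1.99999.1.1 NAME 'customerJoinDate'
      DESC 'Date the customer joined'
      SYNTAX 1.3.6.1.4.1.1466.115.121.1.24 SINGLE-VALUE )
    attributeTypes: ( 1.3.6.1.4.1.99999.1.2 NAME 'customerActive'
      DESC 'Whether the customer is currently active'
      SYNTAX 1.3.6.1.4.1.1466.115.121.1.7 SINGLE-VALUE )
    objectClasses: ( 1.3.6.1.4.1.99999.2.1 NAME 'customerInfo' SUP top AUXILIARY
      MAY ( customerJoinDate $ customerActive ) )
    ```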


  • AWS EC2 can't execute user-data script

    - by Bloodnut
    I'm pretty new to AWS and EC2, but I want to launch instances from another instance and have them run a user-data script after booting. I have installed the EC2 tools and ran the command as explained in various examples, like http://www.turnkeylinux.org/blog/ec2-userdata and Eric Hammond's tutorials. However, when I actually use the command

    ec2-run-instances --key my-key --user-data-file myscript my-ami

    it only launches the new instance and doesn't execute the script. myscript contains:

    ```bash
    #!/bin/bash
    echo "hello" > ~/output.txt
    ```

    I'm running Ubuntu Server 12.04 AMIs; the target AMIs are duplicates of the initiating instance. If I run curl http://169.254.169.254/latest/user-data, the imported script is there.


  • Can't connect to wireless router anymore due to data rate problem

    - by Jay White
    I was playing around with my wireless router and switched the mode to fixed mode B. Now I can no longer associate with the AP. Windows does not give any particular error message, but with Wireshark I see that the returned error is that the client does not support the necessary data rate. My wireless card is type N, and it is set to a/b/g-compatible mode. I tried setting it to just B, but this made no difference. How can I set the data rate of my card so that I can connect to my AP again? I would prefer not to simply reset the device, as some configuration has been done that would be a pain to redo, and I don't have the ISP password handy. Regardless, I would like to understand this situation better.

