Search Results

Search found 6589 results on 264 pages for 'photo upload'.

Page 234/264

  • OsmDroid shows a blank screen with blocks

    - by Lennie
    I am trying to use OsmDroid with MapQuest maps downloaded from Mobile Atlas Creator. I followed all the instructions to generate the map tiles, upload them to the SD card, and so on, but when I run this on the device I get a screen with a bunch of empty boxes... What am I doing wrong?

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.osm_map);
            mapView = (MapView) findViewById(R.id.mapview);
            mapView.setTileSource(TileSourceFactory.MAPQUESTOSM);
            mapView.setBuiltInZoomControls(true);
            mapView.setUseDataConnection(false);
            mapController = mapView.getController();
            mapController.setZoom(15);
        }

        protected boolean isRouteDisplayed() {
            // TODO Auto-generated method stub
            return false;
        }

    Read the article

  • Qt 4.x: how to implement drag-and-drop onto the desktop or into a folder?

    - by Jeremy Friesner
    Hi all, I've written a little file-transfer application in C++ using Qt 4.x... it logs into a server, shows the user a list of files available on the server, and lets the user upload or download files. This all works fine; you can even drag a file in from the desktop (or from an open folder), and when you drop the file icon into the server-files-list-view, the dropped file gets uploaded to the server. Now I have a request for the opposite action as well... my users would like to be able to drag a file out of the server-files-list-view and onto the desktop, or into an open folder window, and have that file get downloaded into that location. That seems like a reasonable request, but I don't know how to implement it. Is there a way for a Qt application to find out the directory corresponding to where the drop event occurred, when the icon was dropped onto the desktop or into an open folder window? Ideally this would be a Qt-based platform-neutral mechanism, but if that doesn't exist, then platform-specific mechanisms for MacOS/X and Windows (XP or higher) would suffice. Any ideas?

    Read the article

  • Storing class constants (for use as bitmask) in a database?

    - by fireeyedboy
    Let's say I have a class called Medium which can represent different types of media, for instance: uploaded video, embedded video, uploaded image, embedded image. I represent these types with constants, like this:

        class MediumAbstract
        {
            const UPLOAD = 0x0001;
            const EMBED  = 0x0010;
            const VIDEO  = 0x0100;
            const IMAGE  = 0x1000;

            const VIDEO_UPLOAD = 0x0101; // for convenience
            const VIDEO_EMBED  = 0x0110; // for convenience
            const IMAGE_UPLOAD = 0x1001; // for convenience
            const IMAGE_EMBED  = 0x1010; // for convenience
            const ALL          = 0x1111; // for convenience
        }

    Thus, it is easy for me to do a combined search on them in an (abstract) repository, with something like:

        public function findAllByType( $type ) { ... }

        $media = $repo->findAllByType( MediumAbstract::VIDEO | MediumAbstract::IMAGE_UPLOAD );
        // or
        $media = $repo->findAllByType( MediumAbstract::ALL );
        // etc..

    How do you feel about using these constant values in a concrete repository like a database? Is it OK? Or should I substitute them with meaningful data in the database?

    Table medium:

        | id | type                 | location   | etc..
        -------------------------------------------------
        | 1  | use constants here?  | /some/path | etc..

    (Of course I'll only be using the meaningful constants: VIDEO_UPLOAD, VIDEO_EMBED, IMAGE_UPLOAD and IMAGE_EMBED.)
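
    A quick way to see how these raw constant values behave once they are in a table is to store the combined flag as a plain integer column and filter with a bitwise AND. The sketch below is purely illustrative Python with SQLite (the question itself is PHP/MySQL, but MySQL accepts the same type & ? predicate); the table layout and sample rows are assumptions, not part of the original question.

        import sqlite3

        # the class constants from the question, as plain integers
        UPLOAD, EMBED, VIDEO, IMAGE = 0x0001, 0x0010, 0x0100, 0x1000
        VIDEO_UPLOAD = VIDEO | UPLOAD   # 0x0101
        IMAGE_UPLOAD = IMAGE | UPLOAD   # 0x1001

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE medium (id INTEGER PRIMARY KEY, type INTEGER, location TEXT)")
        conn.executemany(
            "INSERT INTO medium (type, location) VALUES (?, ?)",
            [(VIDEO_UPLOAD, "/some/path/clip.mp4"),
             (IMAGE_UPLOAD, "/some/path/photo.jpg")],
        )

        # "everything that was uploaded", whatever its media kind: test a single bit
        rows = conn.execute(
            "SELECT id, location FROM medium WHERE (type & ?) != 0",
            (UPLOAD,),
        ).fetchall()
        print(rows)   # both sample rows match

    The trade-off is the one the question hints at: the integers are compact and query well, but nobody reading the table can tell what 0x0101 means without the class definition, which is the usual argument for a lookup table of named values instead.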

    Read the article

  • Connecting to a network drive programmatically and caching credentials

    - by Chris Doggett
    I'm finally set up to be able to work from home via VPN (using Shrew as a client), and I only have one annoyance. We use some batch files to upload config files to a network drive. Works fine from work, and from my team lead's laptop, but both of those machines are on the domain. My home system is not, and won't be, so when I run the batch file, I get a ton of "invalid drive" errors because I'm not a domain user. The solution I've found so far is to make a batch file with the following:

        explorer \\MACHINE1
        explorer \\MACHINE2
        explorer \\MACHINE3

    Then I manually log in to each machine using my domain credentials as they pop up. Unfortunately, there are around 10 machines I may need to use, and it's a pain to keep entering the password if I missed one that a batch file requires. I'm looking into using the answer to this question to make a little C# app that'll take the login info once and log in programmatically. Will the authentication be shared automatically with Explorer, or is there anything special I need to do? If it does work, how long are the credentials cached? Is there an app that does something like this automatically? Unfortunately, domain authentication via the VPN isn't an option, according to our admin. EDIT: If there's a way to pass login info to Explorer via the command line, that would be even easier using Ruby and highline.
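
    One command-line route that may be worth trying before writing the C# app (an untested sketch, not a verified fix): pre-authenticate each machine with net use, so that Explorer and the batch file's drive references can reuse the cached session. Machine names, the domain, and the account below are placeholders; connections made this way normally last for the current logon session and can be dropped again with net use \\MACHINE1 /delete.

        import getpass
        import subprocess

        machines = ["MACHINE1", "MACHINE2", "MACHINE3"]   # placeholder host names
        user = r"MYDOMAIN\myuser"                          # placeholder domain account
        password = getpass.getpass("Domain password: ")

        for m in machines:
            # connect to the hidden IPC$ share just to establish a credentialed session
            subprocess.call(["net", "use", r"\\%s\IPC$" % m, password, "/user:%s" % user])

    The same net use command works directly in a batch file; the Python wrapper only saves retyping the password ten times.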

    Read the article

  • PHP: Problem with filesize

    - by BlaM
    I need a little help here: I get a file from an HTML upload form, and I have a "target" filename in $File. When I do this:

        copy($_FILES['binfile']['tmp_name'], $File);
        echo '<hr>' . filesize($_FILES['binfile']['tmp_name']);
        echo '<hr>' . filesize($File);

    Everything works fine; I get the same number twice. However, when I delete the first call of filesize(), I get "0" (zero):

        copy($_FILES['binfile']['tmp_name'], $File);
        echo '<hr>' . filesize($File);

    Any suggestions? What am I doing wrong? Why do I need to get the filesize of the "original" file before I can get the size of the copy? (That's actually what it is: I need to call filesize() for the original file. Neither sleep() nor calling filesize() of another file helps.)

    System: Apache 2.0, PHP 5.2.6, Debian Linux (Lenny)

    Read the article

  • WCF Service timeout in Callback

    - by Muckers Mate
    I'm trying to get to grips with WCF, in particular writing a WCF service application with a callback. I've set up the service, together with the callback contract, but when the callback is called, the app times out. Essentially, from a client I'm setting a property within the service class. The setter of this property, if it fails validation, fires a callback and, well, this is timing out. I realise that this is probably due to it not being an asynchronous callback, but can someone please show me how to resolve this? Thanks

        // The callback (client-side) interface
        public interface ISAPUploadServiceReply
        {
            [OperationContract(IsOneWay = true)]
            void Reply(int stateCode);
        }

        // The upload service interface
        [ServiceContract(CallbackContract = typeof(ISAPUploadServiceReply))]
        public interface ISAPUploadService
        {
            int ServerState
            {
                [OperationContract]
                get;
                [OperationContract(IsOneWay = true)]
                set;
            }
        }

    And the implementation...

        public int ServerState
        {
            get { return serverState; }
            set
            {
                if (InvalidState(value))
                {
                    var to = OperationContext.Current.GetCallbackChannel<ISAPUploadServiceReply>();
                    to.Reply(eInvalidState);
                }
                else
                    serverState = value;
            }
        }

    Read the article

  • Investigating the root cause behind SharePoint's "request not found in the TrackedRequests"

    - by Muhimbi
    We have a long-standing issue in our bug tracking system about the dreaded "ERROR: request not found in the TrackedRequests. We might be creating and closing webs on different threads." message in SharePoint's trace log. As we develop workflow software for the SharePoint market, we look into this issue from time to time to make sure it is not caused by our products. I have personally come to the conclusion that this is a problem in SharePoint, but perhaps someone else can prove me wrong. Here is what I know: According to the hundreds of search results returned by Google on this topic, the issue appears to be mainly related to SharePoint workflows, both SharePoint Designer and Visual Studio based workflows. Assuming ULS logging is set to Monitorable, the easiest way to reproduce the problem is to create a new SharePoint Designer workflow, attach it to a document library, set it to auto start on add/update, don't add any actions, save the workflow and upload a file to the document library. The error is only visible in the SharePoint trace log; it does not appear to impact the execution of the workflow at hand. I have verified that the problem occurs on 32 bit as well as 64 bit systems, Win2K3 and 2K8, WSS and MOSS, with SharePoint versions up to the December 2009 Cumulative Update (6524). The problem does not occur when a workflow is started manually. There are dozens of related posts on MSDN Forums, hundreds on Google, one on StackOverflow and none on SharePoint Overflow. There appears to be no answer. Does anyone have any idea about what is going on, what is causing this, and whether we should worry or file this under 'Red Herrings'?

    Read the article

  • Apache tomcat7 + couchdb in the same host

    - by demotics2002
    I couldn't find any guide on the internet about how to make them work together. I found some CouchDB tutorials, but they mostly have the web pages hosted in CouchDB's own web server. My requirements:

    1. Use Tomcat 7 (or another version) - I will be using JSP for the website. It has some features that require file upload and processing, file generation, etc., which will require Java. It also has an admin console that will require the next item.
    2. ExtJS (maybe V4) - I will be needing this in the admin console page for RESTful access to CouchDB and other UI components (sorry, but I am not considering jQuery at the moment because I am already familiar with ...).
    3. CouchDB - because the client needs a dynamic structure of data.

    Now my question is how to make Tomcat and CouchDB run on the same host (and port, of course)? As much as possible I would like to avoid making my pages do cross-domain JS calls. Worst case, I may have to create a servlet that overrides put|get|post|delete and calls CouchDB (either by using a driver or HttpClient).

    Read the article

  • Weird URL parse issue. (Android)

    - by Tarmon
    I am attempting to parse in a URL to a KML file from maps.google.com. When I try to use this link:

        http://maps.google.com/maps/ms?ie=UTF8&hl=en&msa=0&msid=112748174025829638330.000483ad6315714cc941d&z=13&output=kml

    I am unable to overlay the KML file on my MapView. If I take the KML file that I get from following this link and upload it to my Dropbox, it works just fine. I think there may be something about the URL from Google that it doesn't like? Link from Dropbox:

        http://dl.dropbox.com/u/1037184/Blue_original.kml

    Also, it would be better if we could just save these KML files locally and pass them in the same way, but I can't figure out a way to do this. Here is the code I am using:

        Intent mapIntent = new Intent(Intent.ACTION_VIEW);
        Uri uri1 = Uri.parse("geo:0,0?q=http://code.google.com/apis/kml/documentation/KML_Samples.kml");
        mapIntent.setData(uri1);
        startActivity(Intent.createChooser(mapIntent, "Test"));

    The URL used in this example also works. So to recap: I am curious as to why some URLs work and others don't. Is there a way to place this KML file locally on the device and pass it to a Uri object? Any other suggestions? Thanks, Rob
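
    One thing to check (a guess based on the two URLs, not a confirmed cause): the working Dropbox link has no query string, while the maps.google.com link is full of & and = characters, which the outer geo: URI will treat as its own parameters unless the inner URL is percent-encoded first. Below is a small Python illustration of that encoding step; in the Android code itself, java.net.URLEncoder.encode would play the same role before building the geo: string.

        from urllib.parse import quote

        kml_url = ("http://maps.google.com/maps/ms?ie=UTF8&hl=en&msa=0"
                   "&msid=112748174025829638330.000483ad6315714cc941d&z=13&output=kml")

        # percent-encode the inner URL so its '&' and '=' survive inside the outer query
        geo_uri = "geo:0,0?q=" + quote(kml_url, safe="")
        print(geo_uri)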

    Read the article

  • Use $_FILES on a page called by .ajax

    - by RachelD
    I have two .php pages that I'm working with. Index.php has a file upload form that posts back to index.php. I can access $_FILES no problem on index.php after submitting the form. My issue is that I want (after the form submit and the page loads) to use .ajax (jQuery) to call another .php file so that file can open and process some of the rows and return the results to ajax. The ajax then displays the results and recursively calls itself to process the next batch of rows. Basically I want to process (put in the DB, etc.) the CSV in chunks and display it for the user between chunks. I'm doing it this way because the files are 400,000+ rows and the user doesn't want to wait the 10+ minutes for them all to be processed. I don't want to move this file (save it) because I just need to process it and throw it away, and if a user closes the page while it's processing, the file won't be thrown away. I could cron script it, but I don't want to. What I would really like to do is pass the (single) $_FILES through .ajax OR save it in a $_POST or $_SESSION to use on the second page. Is there any hope for my cause? Here's the ajax code if that helps:

        function processCSV(startIndex, length) {
            $.ajax({
                url: "ajax-targets/process-csv.php",
                dataType: "json",
                type: "POST",
                data: { startIndex: startIndex, length: length },
                timeout: 60000, // 1000 = 1 sec
                success: function(data) {
                    // jQuery to display the rows from the CSV
                    var newStart = startIndex + length;
                    if (newStart <= data['csvNumRows']) {
                        processCSV(newStart, length);
                    }
                }
            });
        }
        processCSV(1, 2);

    P.S. I did try this: Passing $_FILES or $_POST to a new page with PHP, but it's not working for me :( SOS.

    Read the article

  • Firefox: Can I use a relative path in the BASE tag?

    - by Aaron Digulla
    I have a little web project where I have many pages and an index/ToC file. The ToC file is at the root of my project in toc.html. The pages are spread over a couple of subdirectories and include the ToC with an iframe. The project doesn't need a web server, so I can create the HTML in a directory and browse it in my browser. The problem is that I'm running into XSS issues when JavaScript from toc.html wants to call a function in a page (violation of the same origin policy). So I added base tags in the header with a relative URL to the directory containing toc.html. This works for Konqueror, but in Firefox I have to use absolute paths or the ToC won't even display :( Here is an example:

        <?xml version='1.0' encoding='utf-8' ?>
        <html xmlns="http://www.w3.org/1999/xhtml">
          <head>
            <base href="../" target="_top" />
            <title>Project 1</title>
          </head>
          <body>
            <iframe class="toc" frameborder="0" src="toc.html">
            </iframe>
          </body>
        </html>

    This file is in the subdirectory page. Firefox won't even load it, saying that it can't find page/toc.html. Is there a workaround? I would really like to avoid absolute paths in my export to keep it the same everywhere (locally and when I upload it to the web server later).

    Read the article

  • Best way to pass image to server?

    - by Chris
    I have an SL3 application that needs to be able to pass an image to the server, and then the server will generate a PDF file with the image in it and display it to the user. What I already have in place are the following: (1) code to convert the image to a byte array, (2) code to generate the PDF file with the image. The main problem that I am running into is the following: in order to bypass the pop-up blocker, which is a requirement for my application, I am using the following code:

        var button = new NavigationButton();
        button.NavigateUri = new Uri("http://localhost:3616/PrintReport.aspx?ReportIndex=11&ActionType=Get&ReportIdentifier=" + reportIdentifier.ToString() + "");
        button.TargetName = "_blank";
        button.PerformClick();

    Initially, I would pass the image to a WCF web service (as a byte array), and then "navigate" to the ASP.NET page that would display the report. However, if I do this, then I can not use my custom HyperlinkButton class, and certain browsers, including Safari, will block a new window from opening up. Therefore, it appears that the only option is to use the HyperlinkButton class. What I need to be able to do is to somehow pass the image, as a byte array or some other data type, to the server, such that it can temporarily store the image, even if it is in a server variable, and then immediately retrieve it when I navigate to the PrintReport.aspx page. If I upload the image to an ASP.NET form and then use the HyperlinkButton class to navigate to the PrintReport page, it doesn't work, as the app navigates to the PrintReport page before the system has finished uploading the image. I can't pass it to a web service, as that would require that I navigate to the PrintReport.aspx page in the callback code of the web method that I would be passing the image to, and the HyperlinkButton will not allow that, based on security rules. Any help or ideas would be appreciated. Thanks. Chris

    Read the article

  • Using SVN with a MySQL database run by XAMPP - yes or no? (and how?)

    - by Extrakun
    For my current PHP/MySQL project (over a group of 4 to 5 team members), we are using this setup: each developer codes and tests on his localhost running XAMPP, and uploads to a test server via SVN. One question that I have now is how to synchronize the MySQL database? I may have added a new table to the project and the PHP code references it, so my other team members would need that table for my code (once they get it through SVN) to work. We are not always working in the same office, so having a LAN and a MySQL server in the office is not feasible. So I am toying with 2 solutions:

    1. Set up a test DB online, and have all the coders reference that, even when coding from localhost. Downside: you can't test if you happen not to have internet access.
    2. Somehow sync the localhost copies of the MySQL DB. Is that kind of silly? And if I do consider this, how do I do it? (Which folder do I add to SVN?)

    (I guess a related question is how to automatically update the live MySQL DB from the testing DB, regardless of whether it is on a remote server or hosted locally via XAMPP. Any advice regarding that would be welcomed!)
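
    If option 2 ends up meaning "keep the schema, not MySQL's data files, under version control", one low-tech variant is to commit a mysqldump of the structure next to the code and have everyone re-import it after an svn update. A rough sketch follows; the database name, dump path, and credentials are assumptions.

        import subprocess

        DB_NAME = "project_db"          # assumed local XAMPP database name
        SCHEMA_FILE = "db/schema.sql"   # assumed path inside the working copy, under SVN

        # dump table structure only; the resulting .sql file diffs and merges reasonably in SVN
        with open(SCHEMA_FILE, "w") as out:
            subprocess.check_call(
                ["mysqldump", "--no-data", "--skip-comments", "-u", "root", DB_NAME],
                stdout=out,
            )

        # teammates refresh their local copy from the committed file with something like:
        #   mysql -u root project_db < db/schema.sql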

    Read the article

  • C# ASP.NET file transfer from local machine to another machine

    - by Imcl
    I basically want to transfer a file from the client to the file storage server without an actual login to the server, so that the client cannot access the storage location on the server directly. I can do this only if I manually log in to the storage server through a Windows login; I don't want to do that. This is a web-based application. Using the link below, I wrote some code for my application. I am not able to get it right, though; please refer to the link and help me out with it: http://stackoverflow.com/questions/263518/c-uploading-files-to-file-server

    The following is my code:

        protected void Button1_Click(object sender, EventArgs e)
        {
            filePath = FileUpload1.FileName;
            try
            {
                WebClient client = new WebClient();
                NetworkCredential nc = new NetworkCredential(uName, password);
                Uri addy = new Uri("\\\\192.168.1.3\\upload\\");
                client.Credentials = nc;
                byte[] arrReturn = client.UploadFile(addy, filePath);
                Console.WriteLine(arrReturn.ToString());
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }

    The following line doesn't execute:

        byte[] arrReturn = client.UploadFile(addy, filePath);

    This is the error I get: "An exception occurred during a WebClient request"

    Read the article

  • Problem with Ajax on the hosting server

    - by nelly
    When I implemented the chat function, I used Ajax to send messages from one file to another. It works well on localhost, but when I upload it to the remote server it doesn't work. Can you tell me why? Does Ajax need special configuration? These are the files I used:

    - Ajax.js, which has the "ajax_send" function that I use in chatbox.js
    - chatbox.js, which consists of the functions I use to send data from one PHP file to another and to display the state (a user signing in, a new message being sent, and so on)
    - user.php, which is responsible for writing the user name to the text file usersonline.txt and then displaying the online users in the online users column
    - send.php, which writes to room1.txt
    - recive.php, which reads room1.txt and then writes the content into the chat box

    I believe that the problem comes from the Ajax code in the Ajax.js file, so please help me find the problem and how to solve it. Does Ajax need special server settings? It was working on localhost!

    Read the article

  • Get ID in GridView

    - by Romil
    Hi, I have one grid, say Grid 1, with some columns: a view image button, a delete image button, and a column that says whether the color column is Red or Blue. If the color column is Red, the delete button is hidden; otherwise it is shown (based on whether the user has been given rights to delete a column). Now a user clicks the view button for a Red color column. If this condition is satisfied, I want the delete icon not to be present in Grid 2. Grid 2 has two columns: a delete image button and a file name (which is uploaded via the upload control). So if the view image button in Grid 1 is clicked for a Red column, I should be able to hide the delete button in Grid 2. I have tried writing code in ItemCommand, but I am not able to access the control of Grid 2. Is this the correct way? If not, please suggest a correct way. Please make sure the code is compatible with VS 2003. Let me know if more inputs are needed. Thanks

    Read the article

  • Updating permissions on Amazon S3 files that were uploaded via JungleDisk

    - by Simon_Weaver
    I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a Cloudfront distribution. i.e. I can access it via an http:// URL and I am using Amazon as a CDN. The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL then that URL now returns a 200. I really really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However it doesn't seem to have an option to make the files read-able. I really don't want to have to go to a different tool (especially if i have to buy it) to just change the permissions - and this seems really slow anyway because they generally seem to traverse the whole directory structure. JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not. S3 doesn't appear to propagate permissions down which is a real pain. I'm considering writing a manual tool to traverse my tree and set everything to 'read' but I'd rather not do this if this is a problem someone else has already solved.
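
    For the "manual tool" idea at the end, walking the bucket and flipping every object to public-read is only a few lines with the AWS SDK. Here is a hedged sketch using boto3; the bucket name is an assumption and credentials are expected to come from the usual AWS configuration. A bucket policy that allows anonymous s3:GetObject on the whole bucket is the lower-maintenance alternative if per-object ACLs become a chore.

        import boto3

        BUCKET = "my-cloudfront-bucket"   # assumed bucket behind the CloudFront distribution

        s3 = boto3.resource("s3")          # credentials from environment/shared config
        bucket = s3.Bucket(BUCKET)

        for obj in bucket.objects.all():
            # grant anonymous read so the CloudFront/HTTP URL stops returning AccessDenied
            s3.ObjectAcl(BUCKET, obj.key).put(ACL="public-read")
            print("made public:", obj.key)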

    Read the article

  • NSMutableURLRequest won't turn on 3G

    - by Kyle
    I won't bother posting any code because my code works fine when I launch Safari or something else that turns on the 3G connection first. When I put the phone in Airplane mode and then turn that back off, I get a connection error saying there is no internet connection with NSMutableURLRequest. I've personally mailed Apple about how the antenna gets turned on and off, and they said anything that uses their base CFSocketStream object will turn on the antenna, and I quote: "such as NSURLRequest". BSD sockets do not, just so everyone knows. However, I assumed NSMutableURLRequest would fall into the category of 'does', but it seems like it doesn't. It never succeeds for me when I cycle Airplane mode or have the phone idle for a long time. It will ALWAYS succeed when I open any other network app first to turn the antenna on. Shall I be forced to do a dummy NSURLRequest call to turn this on, or has anyone been able to get this class working just fine? Remember, this is just an implementation of NSMutableURLRequest to upload a file asynchronously, and no other network operations are performed anywhere; no ads, no version checks, no nothing.

    Read the article

  • Access Denied while using System.Diagnostics.Process

    - by Mike C
    I am trying to use the unmanaged ImageMagick library in my ASP.NET application from the command line using System.Diagnostics.Process. Basically, users will upload an .eps file to the site, and then I will run the command-line command to convert it into a .jpg. This is the code I'm using to try to run the command:

        Dim proc As New System.Diagnostics.Process
        proc.StartInfo.RedirectStandardOutput = True
        proc.StartInfo.RedirectStandardError = True
        proc.StartInfo.FileName = "C:\Program Files (x86)\ImageMagick-6.6.1-Q16\convert.exe"
        proc.StartInfo.UseShellExecute = False
        proc.StartInfo.Arguments = String.Format("{0} {1}", Server.MapPath("~/logo/test.eps"), _
            Server.MapPath("~/certificates/temp/test-1234.jpg"))
        proc.StartInfo.CreateNoWindow = True
        proc.Start()

    I am able to run this code just fine on our development Win2K3 server, but not on our production Win2K3 server. I get the error "System.ComponentModel.Win32Exception: Access is denied". The main difference between the two servers is that production is 64-bit and runs Plesk to manage multiple domains. I've tried adding rights for the ASP.NET user to the ImageMagick directory. The PS admin says that in the case of Plesk, it's the same account that I use to access the site in VS using FPE. Does anyone know what I might do in order to allow this process to run on my production server? Thanks, Mike

    Read the article

  • Mysqli query only works on localhost, not webserver

    - by whamo
    Hello. I have changed some of my old queries to the mysqli framework to improve performance. Everything works fine on localhost, but when I upload it to the webserver it outputs nothing. After connecting, I check for errors and there are none. I also checked the PHP modules installed, and mysqli is enabled. I am certain that it creates a connection to the database, as no errors are displayed (when I changed the database name string, it gave an error). There is no output from the query on the webserver, which looks like this:

        $mysqli = new mysqli("server", "user", "password");
        if (mysqli_connect_errno()) {
            printf("Can't connect Errorcode: %s\n", mysqli_connect_error());
            exit;
        }

        // Query used
        $query = "SELECT name FROM users WHERE id = ?";

        if ($stmt = $mysqli->prepare("$query")) {
            // Specify parameters to replace '?'
            $stmt->bind_param("d", $id);
            $stmt->execute();

            // bind variables to prepared statement
            $stmt->bind_result($_userName);

            while ($stmt->fetch()) {
                echo $_userName;
            }

            $stmt->close();
        }

        // close connection
        $mysqli->close();

    As I said, this code works perfectly on my local server, just not online. I checked the error logs and there is nothing, so everything points to a good connection. All the tables exist as well, etc. Anyone have any ideas? This one has me stuck! Also, if I get this working, will all my other queries still work, or will I need to make them use the mysqli framework as well? Thanks in advance.

    Read the article

  • Using hg repository as web site

    - by Tex
    This is somewhat related to my security question here. Is it a bad idea to use an hg / mercurial repository for a live website? If so, why? Furthermore, we have dev, test and production installations of our website, like dev.example.com, test.example.com and www.example.com. If it's a bad idea to use a repository for a live/production website, would it be OK to use an hg repository for the dev and test sites? I'm also concerned about ease of deployment. We have technical and less technical co-workers who will be working with the site. The technical guys (software engineers) won't have any problem working with the command line or TortoiseHG. I'm more concerned about the less technical guys (web designers). They won't be comfortable working on the command line, and may even find TortoiseHG daunting. These guys mostly upload .css files and images to the server. I'd like for these files (at least the .css files) to be under version control, but I want this to be as transparent as possible for the non-technical guys. What's the best way to achieve this?

    Edit: Our 'site' is actually a multi-site CMS setup with a main repository and several subrepositories. Mock-up of the repository structure:

        /root            [main repository containing core files and subrepositories]
        /modules         [modules subrepository]
        /sites/global    [subrepository for global .css and .php files]
        /sites/site1     [site1 subrepository]
        ...
        /sites/siteN     [siteN subrepository]

    Software engineers would work in the root, modules and sites/global repositories. Less technical guys (web designers) would work only in the site1 ... siteN subrepositories.

    Read the article

  • Uploading file with metadata

    - by Dilse Naaz
    Hi, could you help me with how to add a file to a SharePoint document library? I found some articles on the net, but I didn't get the complete concept from them. For now I upload a file without metadata using this code:

        if (fuDocument.PostedFile != null)
        {
            if (fuDocument.PostedFile.ContentLength > 0)
            {
                Stream fileStream = fuDocument.PostedFile.InputStream;
                byte[] byt = new byte[Convert.ToInt32(fuDocument.PostedFile.ContentLength)];
                fileStream.Read(byt, 0, Convert.ToInt32(fuDocument.PostedFile.ContentLength));
                fileStream.Close();

                using (SPSite site = new SPSite(SPContext.Current.Site.Url))
                {
                    using (SPWeb webcollection = site.OpenWeb())
                    {
                        SPFolder myfolder = webcollection.Folders["My Library"];
                        webcollection.AllowUnsafeUpdates = true;
                        myfolder.Files.Add(System.IO.Path.GetFileName(fuDocument.PostedFile.FileName), byt);
                    }
                }
            }
        }

    This code works fine, but I need to upload the file with metadata. Please help me by editing this code if possible. I created 3 columns in my document library.

    Read the article

  • Full-text search on App Engine with Whoosh

    - by Martin
    I need to do full-text searching with Google App Engine. I found the project Whoosh, and it works really well, as long as I use the App Engine development environment... When I upload my application to App Engine, I get the following traceback. For my tests, I am using the example application provided in this project. Any idea of what I am doing wrong?

        <type 'exceptions.ImportError'>: cannot import name loads
        Traceback (most recent call last):
          File "/base/data/home/apps/myapp/1.334374478538362709/hello.py", line 6, in <module>
            from whoosh import store
          File "/base/data/home/apps/myapp/1.334374478538362709/whoosh/__init__.py", line 17, in <module>
            from whoosh.index import open_dir, create_in
          File "/base/data/home/apps/myapp/1.334374478538362709/whoosh/index.py", line 31, in <module>
            from whoosh import fields, store
          File "/base/data/home/apps/myapp/1.334374478538362709/whoosh/store.py", line 27, in <module>
            from whoosh import tables
          File "/base/data/home/apps/myapp/1.334374478538362709/whoosh/tables.py", line 43, in <module>
            from marshal import loads

    Here are the imports I have in my Python file:

        # Whoosh ----------------------------------------------------------------------
        sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'utils')))
        from whoosh.fields import Schema, STORED, ID, KEYWORD, TEXT
        from whoosh.index import getdatastoreindex
        from whoosh.qparser import QueryParser, MultifieldParser

    Thank you in advance for your help!

    Read the article

  • How to make new file permission inherit from the parent directory?

    - by Wai Yip Tung
    I have a directory called data, and I am running a script under the user id 'robot'. robot writes to the data directory and updates files inside it. The idea is that data is open for both me and robot to update, so I set up the permissions and owner group like this:

        drwxrwxr-x 2 me robot-grp 4096 Jun 11 20:50 data

    where both me and robot belong to 'robot-grp'. I changed the permissions and the owner group recursively, like the parent directory. I regularly upload new files into the data directory using rsync. Unfortunately, newly uploaded files do not inherit the parent directory's permissions as I hoped. Instead they look like this:

        -rw-r--r-- 1 me users 6 Jun 11 20:50 new-file.txt

    When robot tries to update new-file.txt, it fails due to lack of file permission. I'm not sure if setting umask helps; in any case the new files do not really follow it:

        $ umask -S
        u=rwx,g=rx,o=rx

    I'm often confounded by Unix file permissions. Do I even have the right plan? I'm using Debian Lenny.
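
    Two things usually account for this kind of setup working or not: the uploading side's umask (a umask of 002, or rsync's --chmod option, grants group write at creation time), and the setgid bit on the directory (chmod g+s data makes new entries inherit the robot-grp group instead of the creator's primary group). As a stopgap, a small sweep after each upload can stamp the parent's group and group read/write bits onto whatever arrived; the path below is an assumption.

        import os
        import stat

        DATA_DIR = "/home/me/data"              # assumed location of the shared directory

        parent_gid = os.stat(DATA_DIR).st_gid   # the robot-grp group of the parent

        for root, dirs, files in os.walk(DATA_DIR):
            for name in dirs + files:
                path = os.path.join(root, name)
                os.chown(path, -1, parent_gid)   # keep the owner, force the shared group
                mode = os.stat(path).st_mode
                os.chmod(path, mode | stat.S_IRGRP | stat.S_IWGRP)   # ensure group read/write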

    Read the article

  • WCF, IIS6.0 (413) Request Entity Too Large.

    - by Andrew Kalashnikov
    Hello, guys. I've got an annoying problem. I've got a WCF service (basicHttpBinding with Transport security, HTTPS). This service implements a contract that consists of 2 methods: LoadData and GetData. GetData works OK! My client receives a package of ~2 MB without problems; everything works correctly. But when I try to load data via bool LoadData(Stream data); (the method signature), I get (413) Request Entity Too Large. Stack trace:

        Server stack trace:
        ServiceModel.Channels.HttpChannelUtilities.ValidateRequestReplyResponse(HttpWebRequest request, HttpWebResponse response, HttpChannelFactory factory, WebException responseException, ChannelBinding channelBinding)
        System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
        System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
        System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)

    I tried this: http://blogs.msdn.com/jiruss/archive/2007/04/13/http-413-request-entity-too-large-can-t-upload-large-files-using-iis6.aspx, but it doesn't work! My server is Windows Server 2003 with IIS 6.0. Please help.

    Read the article
