Search Results

Search found 11780 results on 472 pages for 'firefox profile'.


  • GWT combobox not displaying correctly

    - by James
    Hi, I am using GWT with GWT-EXT running in glassfish. I create 2 combo boxes as follows: import com.extjs.gxt.ui.client.widget.form.ComboBox; import com.extjs.gxt.ui.client.widget.form.SimpleComboBox; this.contentPanel = new ContentPanel(); this.contentPanel.setFrame(true); this.contentPanel.setSize((int)(Window.getClientWidth()*0.95), 600); this.contentPanel.setLayout(new FitLayout()); initWidget(this.contentPanel); SimpleComboBox<String> combo = new SimpleComboBox<String>(); combo.setEmptyText("Select a topic..."); combo.add("String1"); combo.add("String2"); this.contentPanel.add(combo); ComboBox combo1 = new ComboBox(); combo1.setEmptyText("Select a topic..."); ListStore topics = new ListStore(); topics.add("String3"); topics.add("String4"); combo.setStore(topics); this.contentPanel.add(combo1); When these are loaded in the browser (IE 8.0, Firefox 3.6.6 or Chrome 10.0) the combo boxes are shown but don't have the pull down arrow. They look like a text field with the "Select a topic..." text. When you select the text it disappears and if you type a character and then delete it the options are shown (i.e. pull down is invoked) however, there is still no pull down arrow. Does anyone know what the issue might be? Or how I can investigate further? Is it possible to see the actual HTML the browser is getting, when I View Page Source I only get the landing page HTML. As an additional I also have a import com.google.gwt.user.client.ui.Grid that does not render correctly. It is in table format but has no grid lines or header bar etc. Cheers, James

    Read the article

  • Ensuring a div's content has changed over time in Selenium

    - by Stewart Robinson
    I have a div that contains a slideshow. ( http://film.robinhoodtax.org/issues/education ) I am using Selenium to test it. So far I have been using the HTML/Selenium script below to validate that the slideshow is actually working. But my assertEval is failing. How can I correctly detect the slideshow and make sure it is moving?. You can see I've approached this by storing the HTML and waiting then trying again but it isn't working. <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"> <head profile="http://selenium-ide.openqa.org/profiles/test-case"> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <link rel="selenium.base" href="" /> <title>New Test</title> </head> <body> <table cellpadding="1" cellspacing="1" border="1"> <thead> <tr><td rowspan="1" colspan="3">New Test</td></tr> </thead><tbody> <tr> <td>open</td> <td>/issues/education</td> <td></td> </tr> <tr> <td>waitForPageToLoad</td> <td></td> <td></td> </tr> <tr> <td>assertElementPresent</td> <td>css=div[id='views-nivo-slider-ImagesGallery-block_1']</td> <td></td> </tr> <tr> <td>storeHtmlSource</td> <td>css=div[id='views-nivo-slider-ImagesGallery-block_1']</td> <td>first</td> </tr> <tr> <td>pause</td> <td>3000</td> <td></td> </tr> <tr> <td>storeHtmlSource</td> <td>css=div[id='views-nivo-slider-ImagesGallery-block_1']</td> <td>second</td> </tr> <tr> <td>assertEval</td> <td>${first} == ${second}</td> <td>second</td> </tr> </tbody></table> </body> </html>
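    A plain-JavaScript version of the same capture/wait/compare check is sketched below (runnable from the browser console, or adapted for getEval); the element id is the one used in the test table above. Note that ${first} == ${second} is true only when nothing changed, so a moving slideshow is detected by the two snapshots differing.

      // Capture the slider's markup, wait, capture again, and compare.
      function slideshowChanged(callback, waitMs) {
        var el = document.getElementById('views-nivo-slider-ImagesGallery-block_1');
        var first = el.innerHTML;                    // first snapshot
        setTimeout(function () {
          var second = el.innerHTML;                 // second snapshot after the pause
          callback(first !== second);                // true when the slideshow moved
        }, waitMs || 3000);
      }

      // Example: log whether the slider advanced within 3 seconds.
      slideshowChanged(function (moved) {
        console.log(moved ? 'slideshow is moving' : 'slideshow did not change');
      });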

    Read the article

  • Trying to style the first tbody differently than the others without introducing another class

    - by mwiik
    I have a table with multiple tbody's, each of which has a classed row, and I want it so that the classed row in the first tbody has style differences, but am unable to get tbody:first-child to work in any browser. Perhaps I am missing something, or maybe there is a workaround. Ideally, I would like to provide the programmers with a single tbody section they can use as a template, but will otherwise have to add a class to the first tbody, making for an extra test in the programming. The html is straightforward: <tbody class="subGroup"> <tr class="subGroupHeader"> <th colspan="8">All Grades: Special Education</th> <td class="grid" colspan="2"><!-- contains AMO line --></td> <td><!-- right 100 --></td> </tr> <tr>...</tr> <!-- several more rows of data --> </tbody> There are several tbody's per table. I want to style the th and td's within tr.subGroupHeader in the very first tbody differently than the rest. Just to illustrate, I want to add a border-top to the tr.subGroupHeader cells. The tr.subGroupHeader will be styled with a border-top, such as: table.databargraph.continued tr.subGroupHeader th, table.databargraph.continued tr.subGroupHeader td { border-top: 6px solid red; } For the first tbody, I am trying: table.databargraph.continued tbody:first-child tr.subGroupHeader th { border-top: 6px solid blue ; } However, this doesn't seem to work in any browser (I've tested in Safari, Opera, Firefox, and PrinceXML, all on my Mac) Curiously, the usually excellent Xyle Scope tool indicates that the blue border should be taking precedence, though it obviously is not. See the screenshot at http://s3.amazonaws.com/ember/kUD8DHrz06xowTBK3qpB2biPJrLWTZCP_o.png This screenshot shows (top left) the American Indian th is selected, and (bottom right), shows (via black instead of gray text for the css declaration), that, indeed, the blue border should be given precedence. Yet the border is red. I may be missing something fundamental, like pseudo-classes not working for tbodys at all... This really only needs to work in PrinceXML, and maybe Safari so I can see what I'm doing with webkit-based css tools. Note I did try a selector like tr.subGroupHeader:first-child, but such tr's apparently consider the tbody the parent (as I would suspect), thus made every border blue. Thanks...
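    If adding a class to the first tbody turns out to be the only portable route, the tagging can at least be automated with a line of script for the in-browser preview (the print pipeline would still need the class emitted when the markup is generated). A minimal sketch, with the table selector taken from the question and a made-up class name:

      // Tag the first tbody of each matching table so the stylesheet can target
      // ".firstGroup" instead of relying on tbody:first-child.
      window.onload = function () {
        var tables = document.querySelectorAll('table.databargraph.continued');
        for (var i = 0; i < tables.length; i++) {
          var firstBody = tables[i].getElementsByTagName('tbody')[0];
          if (firstBody) {
            firstBody.className += ' firstGroup';    // hypothetical class name
          }
        }
      };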

    Read the article

  • Problem with jQuery accordion and cookies

    - by rayne
    I have a jQuery accordion from this site, which I edited for my purposes, but the accordion only works in Firefox (not Safari or Chrome) and the cookies aren't being set correctly. This is the jQuery: function initMenus() { $('#sidebar .letter_index').hide(); $.each($('#sidebar .letter_index'), function() { var cookie = $.cookie(this.id); if (cookie === null || String(cookie).length < 1) { $('#sidebar .letter_index:first').show(); } else { $('#' + this.id + ' .' + cookie).next().show(); } }); $('#sidebar .letter_head').click(function() { var checkElement = $(this).next(); var parent = this.parentNode.parentNode.id; if ((checkElement.is('.letter_index')) && (!checkElement.is(':visible'))) { $('#' + parent + ' .letter_index:visible').slideUp('normal'); if ((String(parent).length > 0) && (String(this.className).length > 0)) { // not working - how do I change this.class so that the class abc or def gets set instead of this?! $.cookie(parent, this.className); } checkElement.slideDown('normal'); return false; } } ); } $(document).ready(function() {initMenus();}); This is how my HTML looks (sample item): <div id="sidebar"> <h2 class="letter_head"><a href="#" class="ABC">ABC</h2> <ul class="letter_index"> <li>Abc</li> <li>Bcd</li> </ul> </div> I can't figure out why the script won't work in Safari and Chrome. I also don't know how to tell it to use the class of the a inside the h2 as the cookie value. (Currently the cookie is set as $.cookie(parent, this.className);, which produces cookies with the name container (the div above #sidebar) and the value letter_head. It needs to be something like 'sidebar' and 'ABC', 'DEF' and so on.) Thanks in advance!
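    For the second part of the question (cookie name taken from the sidebar, cookie value taken from the class of the a inside the clicked heading), the two values could be read inside the click handler roughly like this; the sketch assumes the same markup as the sample item and the same jQuery cookie plugin:

      // Inside the .letter_head click handler: name the cookie after the
      // enclosing #sidebar div and store the class of the inner <a>.
      $('#sidebar .letter_head').click(function () {
        var cookieName  = $(this).closest('div').attr('id');    // "sidebar"
        var cookieValue = $(this).find('a').attr('class');      // "ABC", "DEF", ...
        if (cookieName && cookieValue) {
          $.cookie(cookieName, cookieValue);                    // jQuery cookie plugin
        }
      });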

    Read the article

  • How do I display a jQuery dialog box before the entire page is loaded?

    - by obarshay
    On my site a number of operations can take a long time to complete. When I know a page will take a while to load, I would like to display a progress indicator while the page is loading. Ideally I would like to say something along the lines of: $("#dialog").show("progress.php"); and have that overlay on top of the page that is being loaded (disappearing after the operation is completed). Coding the progress bar and displaying progress is not an issue, the issue is getting a progress indicator to pop up WHILE the page is being loaded. I have been trying to use JQuery's dialogs for this but they only appear after the page is already loaded. This has to be a common problem but I am not familiar enough with JavaScript to know the best way to do this. Here's simple example to illustrate the problem. The code below fails to display the dialog box before the 20 second pause is up. I have tried in Chrome and Firefox. In fact I don't even see the "Please Wait..." text. Here's the code I am using: <html> <head> <link type="text/css" href="http://jqueryui.com/latest/themes/base/ui.all.css" rel="stylesheet" /> <script type="text/javascript" src="http://jqueryui.com/latest/jquery-1.3.2.js"></script> <script type="text/javascript" src="http://jqueryui.com/latest/ui/ui.core.js"></script> <script type="text/javascript" src="http://jqueryui.com/latest/ui/ui.dialog.js"></script> </head> <body> <div id="please-wait">My Dialog</div> <script type="text/javascript"> $("#please-wait").dialog(); </script> <?php flush(); echo "Waiting..."; sleep(20); ?> </body> </html>
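    For comparison, a plain-DOM overlay created by an inline script placed right after the opening body tag does not wait for jQuery UI at all; whether it paints before the server finishes still depends on the browser and on the output actually being flushed early. A minimal sketch (the id and styling are made up):

      // Inline <script> just after <body> opens: show a simple "please wait"
      // overlay immediately, then remove it once the whole page has loaded.
      (function () {
        var overlay = document.createElement('div');
        overlay.id = 'loading-overlay';                        // hypothetical id
        overlay.appendChild(document.createTextNode('Please wait...'));
        overlay.style.cssText =
          'position:fixed;top:0;left:0;right:0;bottom:0;' +
          'background:#fff;text-align:center;padding-top:20%;';
        document.body.appendChild(overlay);

        window.onload = function () {
          overlay.parentNode.removeChild(overlay);             // hide when loading ends
        };
      })();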

    Read the article

  • PHP mini webserver file download

    - by snikolov
      $httpsock = @socket_create_listen("9090");
      if (!$httpsock) {
          print "Socket creation failed!\n";
          exit;
      }
      while (1) {
          $client = socket_accept($httpsock);
          $input = trim(socket_read($client, 4096));
          $input = explode(" ", $input);
          $input = $input[1];
          $fileinfo = pathinfo($input);
          switch ($fileinfo['extension']) {
              default:
                  $mime = "text/html";
          }
          if ($input == "/") {
              $input = "index.html";
          }
          $input = ".$input";
          if (file_exists($input) && is_readable($input)) {
              echo "Serving $input\n";
              $contents = file_get_contents($input);
              $output = "HTTP/1.0 200 OK\r\nServer: APatchyServer\r\nConnection: close\r\nContent-Type: $mime\r\n\r\n$contents";
          } else {
              //$contents = "The file you requested doesn't exist. Sorry!";
              //$output = "HTTP/1.0 404 OBJECT NOT FOUND\r\nServer: BabyHTTP\r\nConnection: close\r\nContent-Type: text/html\r\n\r\n$contents";
              function openfile() {
                  $filename = "a.pl";
                  $file = fopen($filename, 'r');
                  $filesize = filesize($filename);
                  $buffer = fread($file, $filesize);
                  $array = array("Output"=>$buffer, "filesize"=>$filesize, "filename"=>$filename);
                  return $array;
              }
              $send = openfile();
              $file = $send['filename'];
              $filesize = $send['filesize'];
              $output = 'HTTP/1.0 200 OK\r\n';
              $output .= "Content-type: application/octet-stream\r\n";
              $output .= 'Content-Disposition: attachment; filename="'.$file.'"\r\n';
              $output .= "Content-Length:$filesize\r\n";
              $output .= "Accept-Ranges: bytes\r\n";
              $output .= "Cache-Control: private\n\n";
              $output .= $send['Output'];
              $output .= "Content-Transfer-Encoding: binary";
              $output .= "Connection: Keep-Alive\r\n";
          }
          socket_write($client, $output);
          socket_close($client);
      }
      socket_close($httpsock);

    Hello, I am snikolov and I am creating a mini webserver with PHP. I would like to know how I can send the client a file to download with his browser, such as Firefox or Internet Explorer. I am sending a file to the user to download via sockets, but the client is not getting the filename and the information to download. Can you please help me here? If I declare the function again I get this error in my server: Fatal error: Cannot redeclare openfile() (previously declared in C:\Users\fsfdsf\sfdsfsdf\httpd.php:31) in C:\Users\hfghfgh\hfghg\httpd.php on line 29. If it is possible, I would also like to know whether the webserver can show how much bandwidth the user requests via sockets; Perl has the same option but it is more hardcore than PHP and I don't understand much about Perl. I have even seen that a mini webserver can show how much the client pulls from the server. Would it be possible for you to assist me with this code? I much appreciate it, thank you guys.
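    Two details in the header block above are worth a second look: the lines built with single quotes send a literal \r\n instead of a real line break, and the Content-Transfer-Encoding and Connection lines are appended after the body bytes. Purely for comparison (not the poster's PHP), here is the same download response laid out in a small Node.js socket sketch, with the headers terminated by one blank line before the file contents; the file name and port are placeholders:

      // Minimal raw-socket file download: headers, a blank line, then the bytes.
      var net = require('net');
      var fs  = require('fs');

      var server = net.createServer(function (client) {
        var filename = 'a.pl';                        // placeholder file
        var body     = fs.readFileSync(filename);

        var headers = [
          'HTTP/1.0 200 OK',
          'Server: MiniServer',
          'Content-Type: application/octet-stream',
          'Content-Disposition: attachment; filename="' + filename + '"',
          'Content-Length: ' + body.length,
          'Connection: close',
          '',                                         // blank line ends the headers
          ''
        ].join('\r\n');

        client.write(headers);                        // header block first...
        client.write(body);                           // ...then the raw file bytes
        client.end();
      });

      server.listen(9090);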

    Read the article

  • FFmpeg creates empty (black) frames

    - by resamsel
    I have a set of images from a timelapse shot (172 JPG files) that I want to convert into a movie. I tried several parameters with FFmpeg, but all I get is a video with black frames (though it has the expected length). ffmpeg -f image2 -vcodec mjpeg -y -i img_%03d.jpg timelapse2.mpg The command above creates this video: http://sdm-net.org/data/timelapse2.mpg What I'm expecting is something like this (created with Time Lapse Assembler.app): https://vimeo.com/39038362 - This is my fallback option, but I'd really like to create timelapse movies from a script. I'm on OSX Lion (10.7.3) with FFmpeg version (0.10) installed via Homebrew. I also tried to find a proper version of mencoder for OSX, but this doesn't seem to be an easy task. Also, ImageMagick's convert doesn't seem to work nicely, it creates really bad output and it seems there's not much I can do about it... Edit: With libx264 and an mp4 container: ffmpeg -f image2 -y -i img_%03d.jpg -vcodec libx264 timelapse4.mp4 Output: ffmpeg version 0.10 Copyright (c) 2000-2012 the FFmpeg developers built on Mar 26 2012 13:47:02 with clang 3.0 (tags/Apple/clang-211.12) configuration: --prefix=/usr/local/Cellar/ffmpeg/0.10 --enable-shared --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --enable-libfreetype --cc=/usr/bin/clang --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libass --disable-ffplay libavutil 51. 34.101 / 51. 34.101 libavcodec 53. 60.100 / 53. 60.100 libavformat 53. 31.100 / 53. 31.100 libavdevice 53. 4.100 / 53. 4.100 libavfilter 2. 60.100 / 2. 60.100 libswscale 2. 1.100 / 2. 1.100 libswresample 0. 6.100 / 0. 6.100 libpostproc 52. 0.100 / 52. 0.100 Input #0, image2, from 'img_%03d.jpg': Duration: 00:00:06.88, start: 0.000000, bitrate: N/A Stream #0:0: Video: mjpeg, yuvj420p, 3888x2592 [SAR 72:72 DAR 3:2], 25 fps, 25 tbr, 25 tbn, 25 tbc [buffer @ 0x7f8ec9415f20] w:3888 h:2592 pixfmt:yuvj420p tb:1/1000000 sar:72/72 sws_param: [libx264 @ 0x7f8ec981d800] using SAR=1/1 [libx264 @ 0x7f8ec981d800] frame MB size (243x162) > level limit (36864) [libx264 @ 0x7f8ec981d800] MB rate (984150) > level limit (983040) [libx264 @ 0x7f8ec981d800] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShuffle SSE4.2 AVX [libx264 @ 0x7f8ec981d800] profile High, level 5.1 [libx264 @ 0x7f8ec981d800] 264 - core 120 - H.264/MPEG-4 AVC codec - Copyleft 2003-2011 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00 Output #0, mp4, to 'timelapse4.mp4': Metadata: encoder : Lavf53.31.100 Stream #0:0: Video: h264 (![0][0][0] / 0x0021), yuvj420p, 3888x2592 [SAR 72:72 DAR 3:2], q=-1--1, 25 tbn, 25 tbc Stream mapping: Stream #0:0 -> #0:0 (mjpeg -> libx264) Press [q] to stop, [?] 
for help frame= 172 fps= 18 q=-1.0 Lsize= 259kB time=00:00:06.80 bitrate= 312.3kbits/s video:256kB audio:0kB global headers:0kB muxing overhead 1.089647% [libx264 @ 0x7f8ec981d800] frame I:1 Avg QP: 9.60 size:212820 [libx264 @ 0x7f8ec981d800] frame P:43 Avg QP:30.50 size: 291 [libx264 @ 0x7f8ec981d800] frame B:128 Avg QP:31.00 size: 285 [libx264 @ 0x7f8ec981d800] consecutive B-frames: 0.6% 0.0% 1.7% 97.7% [libx264 @ 0x7f8ec981d800] mb I I16..4: 22.5% 77.2% 0.3% [libx264 @ 0x7f8ec981d800] mb P I16..4: 0.0% 0.0% 0.0% P16..4: 0.0% 0.0% 0.0% 0.0% 0.0% skip:100.0% [libx264 @ 0x7f8ec981d800] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 0.0% 0.0% 0.0% direct: 0.0% skip:100.0% L0: 1.2% L1:98.8% BI: 0.0% [libx264 @ 0x7f8ec981d800] 8x8 transform intra:77.2% inter:100.0% [libx264 @ 0x7f8ec981d800] coded y,uvDC,uvAC intra: 41.2% 23.4% 0.6% inter: 0.0% 0.0% 0.0% [libx264 @ 0x7f8ec981d800] i16 v,h,dc,p: 40% 25% 35% 1% [libx264 @ 0x7f8ec981d800] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 36% 32% 30% 1% 0% 0% 0% 0% 0% [libx264 @ 0x7f8ec981d800] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 51% 40% 6% 1% 1% 0% 1% 0% 1% [libx264 @ 0x7f8ec981d800] i8c dc,h,v,p: 60% 21% 19% 0% [libx264 @ 0x7f8ec981d800] Weighted P-Frames: Y:0.0% UV:0.0% [libx264 @ 0x7f8ec981d800] ref P L0: 92.3% 0.0% 0.0% 7.7% [libx264 @ 0x7f8ec981d800] ref B L0: 50.0% 0.0% 50.0% [libx264 @ 0x7f8ec981d800] ref B L1: 99.4% 0.6% [libx264 @ 0x7f8ec981d800] kb/s:304.49 Output timelapse4.mp4 (beacause of spam protection I can only post two links with my reputation): http sdm-net.org/data/timelapse4.mp4

    Read the article

  • HTML5 <audio> Safari live broadcast vs not

    - by Peter Parente
    I'm attempting to embed an HTML5 audio element pointing to MP3 or OGG data served by a PHP file . When I view the page in Safari, the controls appear, but the UI says "Live Broadcast." When I click play, the audio starts as expected. Once it ends, however, I can't start it playing again by clicking play. Even using the JS API on the audio element and setting currentTime to 0 fails with an index error exception. I suspected the headers from the PHP script were the problem, particularly missing a content length. But that's not the case. The response headers include a proper Content- Length to indicate the audio has finite size. Furthermore, everything works as expected in Firefox 3.5+. I can click play on the audio element multiple times to hear the sound replay. If I remove the PHP script from the equation and serve up a static copy of the MP3 file, everything works fine in Safari. Does this mean Safari is treating audio src URLs with query parameters differently than URLs that don't have them? Anyone have any luck getting this to work? My simple example page is: <!DOCTYPE html> <html> <head></head> <body> <audio controls autobuffer> <source src="say.php?text=this%20is%20a%20test&format=.ogg" /> <source src="say.php?text=this%20is%20a%20test&format=.mp3" /> </audio> </body> </html> HTTP Headers from PHP script: HTTP/1.x 200 OK Date: Sun, 03 Jan 2010 15:39:34 GMT Server: Apache X-Powered-By: PHP/5.2.10 Content-Length: 8993 Keep-Alive: timeout=2, max=98 Connection: Keep-Alive Content-Type: audio/mpeg HTTP Headers from direct file access: HTTP/1.x 200 OK Date: Sun, 03 Jan 2010 20:06:59 GMT Server: Apache Last-Modified: Sun, 03 Jan 2010 03:20:02 GMT Etag: "a404b-c3f-47c3a14937c80" Accept-Ranges: bytes Content-Length: 8993 Keep-Alive: timeout=2, max=100 Connection: Keep-Alive Content-Type: audio/mpeg I tried hard-coding the Accept-Ranges header into the script too, but no luck.
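    Since the working static-file case advertises Accept-Ranges and Safari is known to issue Range requests for media, one concrete thing to compare is how the script responds to a Range header. The sketch below is only an illustration of that behaviour in Node.js (not a claim that it is the fix, and the file name is a placeholder): it answers plain requests with 200 and simple bytes=start-end requests with 206.

      // Serve an MP3, honouring simple single-range requests the way a static
      // file server would.
      var http = require('http');
      var fs   = require('fs');

      http.createServer(function (req, res) {
        var file  = 'say.mp3';                        // placeholder file
        var total = fs.statSync(file).size;
        var range = req.headers.range;                // e.g. "bytes=0-" from Safari

        if (range) {
          var parts = range.replace(/bytes=/, '').split('-');
          var start = parseInt(parts[0], 10);
          var end   = parts[1] ? parseInt(parts[1], 10) : total - 1;

          res.writeHead(206, {
            'Content-Type': 'audio/mpeg',
            'Accept-Ranges': 'bytes',
            'Content-Range': 'bytes ' + start + '-' + end + '/' + total,
            'Content-Length': end - start + 1
          });
          fs.createReadStream(file, { start: start, end: end }).pipe(res);
        } else {
          res.writeHead(200, {
            'Content-Type': 'audio/mpeg',
            'Accept-Ranges': 'bytes',
            'Content-Length': total
          });
          fs.createReadStream(file).pipe(res);
        }
      }).listen(8080);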

    Read the article

  • XSL file handling through JavaScript

    - by Zaid Iqbal
    i want to handle my xsl file through my javascript code. I made my XSL file but i want to dynamically change my XSL file at run time.As in add more attributes in header or data. My javascript code as follow` <script> function loadXMLDoc(dname) { if (window.XMLHttpRequest) { xhttp=new XMLHttpRequest(); } else { xhttp=new ActiveXObject("Microsoft.XMLHTTP"); } xhttp.open("GET",dname,false); xhttp.send(""); return xhttp.responseXML; } function displayResult() { xml=loadXMLDoc("cdcatalog.xml"); xsl=loadXMLDoc("cdcatalog.xsl"); // code for IE if (window.ActiveXObject) { ex=xml.transformNode(xsl); document.getElementById("example").innerHTML=ex; } // code for Mozilla, Firefox, Opera, etc. else if (document.implementation && document.implementation.createDocument) { xsltProcessor=new XSLTProcessor(); xsltProcessor.importStylesheet(xsl); resultDocument = xsltProcessor.transformToFragment(xml,document); document.getElementById("example").appendChild(resultDocument); } } </script> My XSL file code: <?xml version="1.0" encoding="ISO-8859-1"?> <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> <xsl:template match="/"> <html> <body> <h2>My CD Collection</h2> <table border="1"> <tr bgcolor="#9acd32"> <th align="left">Title</th> <th align="left">Artist</th> <th align="left">Country</th> </tr> <xsl:for-each select="catalog/cd"> <tr> <td><xsl:value-of select="title" /></td> <td><xsl:value-of select="artist" /></td> <td><xsl:value-of select="country" /></td> </tr> </xsl:for-each> </table> </body> </html> </xsl:template> </xsl:stylesheet> `
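    For the "change the XSL at run time" part: the stylesheet returned by loadXMLDoc is an XML document like any other, so nodes can be added to it before it is handed to XSLTProcessor. A sketch for the standards branch only, where the extra "Year" column and its element name are invented for illustration:

      // Add one header cell and one data column to the loaded stylesheet,
      // then run the transform as before.
      var XSL_NS = 'http://www.w3.org/1999/XSL/Transform';

      function addColumn(xslDoc, headerText, selectExpr) {
        var rows = xslDoc.getElementsByTagName('tr');

        // New <th> in the header row.
        var th = xslDoc.createElement('th');
        th.setAttribute('align', 'left');
        th.appendChild(xslDoc.createTextNode(headerText));
        rows[0].appendChild(th);

        // New <td><xsl:value-of select="..."/></td> in the xsl:for-each row.
        var td = xslDoc.createElement('td');
        var valueOf = xslDoc.createElementNS(XSL_NS, 'xsl:value-of');
        valueOf.setAttribute('select', selectExpr);
        td.appendChild(valueOf);
        rows[1].appendChild(td);
      }

      var xml = loadXMLDoc('cdcatalog.xml');
      var xsl = loadXMLDoc('cdcatalog.xsl');
      addColumn(xsl, 'Year', 'year');                 // "year" is a made-up element

      var proc = new XSLTProcessor();
      proc.importStylesheet(xsl);
      document.getElementById('example')
              .appendChild(proc.transformToFragment(xml, document));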

    Read the article

  • UTF-8 GET using Indy 10.5.8.0 and Delphi XE2

    - by Bogdan Botezatu
    I'm writing my first Unicode application with Delphi XE2 and I've stumbled upon an issue with GET requests to an Unicode URL. Shortly put, it's a routine in a MP3 tagging application that takes a track title and an artist and queries Last.FM for the corresponding album, track no and genre. I have the following code: function GetMP3Info(artist, track: string) : TMP3Data //<---(This is a record) var TrackTitle, ArtistTitle : WideString; webquery : WideString; [....] WebQuery := UTF8Encode('http://ws.audioscrobbler.com/2.0/?method=track.getcorrection&api_key=' + apikey + '&artist=' + artist + '&track=' + track); //[processing the result in the web query, getting the correction for the artist and title] // eg: for artist := Bucovina and track := Mestecanis, the corrected values are //ArtistTitle := Bucovina; // TrackTitle := Mestecani?; //Now here is the tricky part: webquery := UTF8Encode('http://ws.audioscrobbler.com/2.0/?method=track.getInfo&api_key=' + apikey + '&artist=' + unescape(ArtistTitle) + '&track=' + unescape(TrackTitle)); //the unescape function replaces spaces (' ') with '+' to comply with the last.fm requests [some more processing] end; The webquery looks in a TMemo just right (http://ws.audioscrobbler.com/2.0/?method=track.getInfo&api_key=e5565002840xxxxxxxxxxxxxx23b98ad&artist=Bucovina&track=Mestecani?) Yet, when I try to send a GET() to the webquery using IdHTTP (with the ContentEncoding property set to 'UTF-8'), I see in Wireshark that the component is GET-ing the data to the ANSI value '/2.0/?method=track.getInfo&api_key=e5565002840xxxxxxxxxxxxxx23b98ad&artist=Bucovina&track=Mestec?ni?' Here is the full headers for the GET requests and responses: GET /2.0/?method=track.getInfo&api_key=e5565002840xxxxxxxxxxxxxx23b98ad&artist=Bucovina&track=Mestec?ni? HTTP/1.1 Content-Encoding: UTF-8 Host: ws.audioscrobbler.com Accept: text/html, */* Accept-Encoding: identity User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.23) Gecko/20110920 Firefox/3.6.23 SearchToolbar/1.22011-10-16 20:20:07 HTTP/1.0 400 Bad Request Date: Tue, 09 Oct 2012 20:46:31 GMT Server: Apache/2.2.22 (Unix) X-Web-Node: www204 Access-Control-Allow-Origin: * Access-Control-Allow-Methods: POST, GET, OPTIONS Access-Control-Max-Age: 86400 Cache-Control: max-age=10 Expires: Tue, 09 Oct 2012 20:46:42 GMT Content-Length: 114 Connection: close Content-Type: text/xml; charset=utf-8; <?xml version="1.0" encoding="utf-8"?> <lfm status="failed"> <error code="6"> Track not found </error> </lfm> The question that puzzles me is am I overseeing anything related to setting the property of the tidhttp control? How can I stop the well-formated URL i'm composing in the application from getting wrongfully sent to the server? Thanks.
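    The difference between the TMemo text and what Wireshark shows comes down to how the non-ASCII characters in the query string get percent-encoded before the request goes out. Purely as an illustration of that encoding step (not Delphi or Indy code), this is the same URL built in JavaScript, where encodeURIComponent performs the UTF-8 percent-encoding; the API key is truncated and the diacritics are approximate:

      // Build the track.getInfo URL with every query value percent-encoded as
      // UTF-8 so characters outside ASCII survive the request intact.
      function buildTrackInfoUrl(apiKey, artist, track) {
        return 'http://ws.audioscrobbler.com/2.0/?method=track.getInfo' +
               '&api_key=' + encodeURIComponent(apiKey) +
               '&artist='  + encodeURIComponent(artist) +
               '&track='   + encodeURIComponent(track);
      }

      console.log(buildTrackInfoUrl('e5565002840...', 'Bucovina', 'Mestecăniș'));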

    Read the article

  • Dojo layout tutorial for version 1.7 doesn't work in 1.7.2

    - by Sheena
    This is sortof a continuation to dojo1.7 layout acting screwy. So I made some working widgets and tested them out, i then tried altering my work using the tutorial at http://dojotoolkit.org/documentation/tutorials/1.7/dijit_layout/ to make the layout nice. After failing at that in many interesting ways (thus my last question) I started on a new path. My plan is now to implement the layout tutorial example and then stick in my widgets. For some reason even following the tutorial wont work... everything loads then disappears and I'm left with a blank browser window. Any ideas? It just struck me that it could be browser compatibility issues, I'm working on Firefox 13.0.1. As far as I know Dojo is supposed to be compatible with this... anyway, have some code: HTML: <body class="claro"> <div id="appLayout" class="demoLayout" data-dojo-type="dijit.layout.BorderContainer" data-dojo-props="design: 'headline'"> <div class="centerPanel" data-dojo-type="dijit.layout.ContentPane" data-dojo-props="region: 'center'"> <div> <h4>Group 1 Content</h4> <p>stuff</p> </div> <div> <h4>Group 2 Content</h4> </div> <div> <h4>Group 3 Content</h4> </div> </div> <div class="edgePanel" data-dojo-type="dijit.layout.ContentPane" data-dojo-props="region: 'top'"> Header content (top) </div> <div id="leftCol" class="edgePanel" data-dojo-type="dijit.layout.ContentPane" data-dojo-props="region: 'left', splitter: true"> Sidebar content (left) </div> </div> </body> Dojo Configuration: var dojoConfig = { baseUrl: "${request.static_url('mega:static/js')}", //this is in a mako template tlmSiblingOfDojo: false, packages: [ { name: "dojo", location: "libs/dojo" }, { name: "dijit", location: "libs/dijit" }, { name: "dojox", location: "libs/dojox" }, ], parseOnLoad: true, has: { "dojo-firebug": true, "dojo-debug-messages": true }, async: true }; other js stuff: require(["dijit/layout/BorderContainer", "dijit/layout/TabContainer", "dijit/layout/ContentPane", "dojo/parser"]); css: html, body { height: 100%; margin: 0; overflow: hidden; padding: 0; } #appLayout { height: 100%; } #leftCol { width: 14em; }

    Read the article

  • MooTools 1.2.4 delegation not working in IE8?

    - by michael
    Hey there everybody-- So I have a listbox next to a form. When the user clicks an option in the select box, I make a request for the related data, returned in a JSON object, which gets put into the form elements. When the form is saved, the request goes thru and the listbox is rebuilt with the updated data. Since it's being rebuilt I'm trying to use delegation on the listbox's parent div for the onchange code. The trouble I'm having is with IE8 (big shock) not firing the delegated event. I have the following HTML: <div id="listwrapper" class="span-10 append-1 last"> <select id="list" name="list" size="20"> <option value="86">Adrian Franklin</option> <option value="16">Adrian McCorvey</option> <option value="196">Virginia Thomas</option> </select> </div> and the following script to go with it: window.addEvent('domready', function() { var jsonreq = new Request.JSON(); $('listwrapper').addEvent('change:relay(select)', function(e) { alert('this doesn't fire in IE8'); e.stop(); var status= $('statuswrapper').empty().addClass('ajax-loading'); jsonreq.options.url = 'de_getformdata.php'; jsonreq.options.method = 'post'; jsonreq.options.data = {'getlist':'<?php echo $getlist ?>','pkey':$('list').value}; jsonreq.onSuccess = function(rObj, rTxt) { status.removeClass('ajax-loading'); for (key in rObj) { status.set('html','You are currently editing '+rObj['cname']); if ($chk($(key))) $(key).value = rObj[key]; } $('lalsoaccomp-yes').set('checked',(($('naccompkey').value > 0)?'true':'false')); $('lalsoaccomp-no').set('checked',(($('naccompkey').value > 0)?'false':'true')); } jsonreq.send(); }); }); (I took out a bit of unrelated stuff). So this all works as expected in firefox, but IE8 refuses to fire the delegated change event on the select element. If I attach the change function directly to the select, then it works just fine. Am I missing something? Does IE8 just not like the :relay? Sidenote: I'm very new to mootools and javascripting, etc, so if there's something that can be improved code-wise, please let me know too.. Thanks!

    Read the article

  • PHP cURL download bandwidth limiting

    - by snikolov
    Hey guys, I have been trying to limit the bandwidth with PHP. Can you please help here? I can't get the download rate to be limited with PHP. Thanks a million! function total_filesize($url) { $ch = curl_init(); curl_setopt($ch, CURLOPT_URL, "$url"); curl_setopt($ch, CURLINFO_SPEED_DOWNLOAD,12); //ITS NOT WORKING! curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.11) ". "Gecko/20071127 Firefox/2.0.0.11"); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE); curl_setopt($ch, CURLOPT_HEADER, false); curl_setopt($ch, CURLOPT_NOBODY, true); $chStore = curl_exec($ch); $chError = curl_error($ch); $chInfo = curl_getinfo($ch); curl_close($ch); return $size = $chInfo['download_content_length']; } function __define_url($url) { $basename = basename($url); Define('filename',$basename); $define_file_size = total_filesize($url); Define('filesizes',$define_file_size); } function _download_file($url_file) { __define_url($url_file); // $range = "50000-60000"; $filesize = filesizes; $file = filename; header('Content-Type: application/octet-stream'); header('Content-Disposition: attachment; filename="'.$file.'"'); header('Content-Transfer-Encoding: binary'); header("Content-Length: $filesize"); $ch = curl_init(); curl_setopt($ch, CURLOPT_URL,"$url_file"); curl_setopt($ch, CURLOPT_BINARYTRANSFER,1); // curl_setopt($ch, CURLOPT_RANGE,$range); curl_exec($ch); curl_close($ch); } _download_file('http://rarlabs.com/rar/wrar393.exe');
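    As shown in the code, CURLINFO_SPEED_DOWNLOAD is a constant for curl_getinfo (it reports the measured speed) rather than an option that caps it, so setting it has no throttling effect; newer libcurl/PHP builds expose CURLOPT_MAX_RECV_SPEED_LARGE for the receiving side, if available. The general shape of rate-limiting a relayed download is to send the file in timed chunks; a rough Node.js sketch of that idea is below (file name and rate are placeholders, and this is not the poster's PHP):

      // Relay a local file at roughly `bytesPerSecond` by writing one chunk
      // per second.
      var http = require('http');
      var fs   = require('fs');

      function sendThrottled(res, filename, bytesPerSecond) {
        var size = fs.statSync(filename).size;
        res.writeHead(200, {
          'Content-Type': 'application/octet-stream',
          'Content-Disposition': 'attachment; filename="' + filename + '"',
          'Content-Length': size
        });

        var stream = fs.createReadStream(filename, { highWaterMark: bytesPerSecond });
        stream.on('data', function (chunk) {
          stream.pause();                                       // stop reading...
          res.write(chunk);
          setTimeout(function () { stream.resume(); }, 1000);   // ...resume in 1 s
        });
        stream.on('end', function () { res.end(); });
      }

      http.createServer(function (req, res) {
        sendThrottled(res, 'wrar393.exe', 64 * 1024);           // ~64 KB/s, placeholder file
      }).listen(8080);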

    Read the article

  • jQuery $.ajax doesn't return anything, but only in Google Chrome!?

    - by Shawson
    Hi All, I'm hoping someone can help me with this as I'm at a loss. I'm trying to simply load a plain text file into a page at runtime using jquery- everything works fine in IE8 (8.0.7600.16385), Firefox 3.6.3, however in Google Chrome 5.0.375.55 the "data" comes back as nothing- i get an empty alert box. This is the code i'm using; <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>Animation Test</title> <script type="text/javascript" language="javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script> <script type="text/javascript" language="javascript"> $(document).ready(function () { $.ajax({ url: 'level1.txt', success: function (data) { alert(data); }, async: true, type: 'GET' }); }); </script> </head> <body> <canvas id="canvas" width="640" height="480"> Unsupported Browser </canvas> </body> </html> The file I'm loading in is a plain text file containing this; Central Cavern 100 O.........1.C....C...........1.O O................1.............O O..............................O O..............................O O......................B1..B...O O=============~~~~=~~~~========O O.............................1O O===...........................O O............A..OOO.B..........O O====...<<<<<<<<<<<<<<<<<<<<...O O............................==O O..............................O O..........B........OOO.....===O O....===============...........O O%............................XO O==============================O (Yes- it's the first level from Manic Miner! I'm making a javascript version using the html5 canvas to get my head around using it.) I'm at a total loss- it can't be the code because it runs in the other 2 browsers- is there an issue with jquery and this version of Chrome? Thanks for reading!! Shaw.
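    One low-effort diagnostic is to log the transport status and any error, since an empty success payload and a failed request can look identical in an alert; a sketch of the same GET with extra handlers (same level file name):

      // Same request, with error/complete handlers so an empty response can be
      // told apart from a request that never succeeded.
      $(document).ready(function () {
        $.ajax({
          url: 'level1.txt',
          type: 'GET',
          dataType: 'text',                  // treat the level file as plain text
          success: function (data) {
            console.log('success, length=' + (data || '').length);
          },
          error: function (xhr, textStatus, errorThrown) {
            console.log('error: status=' + xhr.status + ' ' + textStatus + ' ' + errorThrown);
          },
          complete: function (xhr) {
            console.log('complete: HTTP status=' + xhr.status);
          }
        });
      });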

    Read the article

  • AngularJS: download a PDF file from the server

    - by Bartosz Bialecki
    I want to download a pdf file from the web server using $http. I use this code which works great, my file only is save as a html file, but when I open it it is opened as pdf but in the browser. I tested it on Chrome 36, Firefox 31 and Opera 23. This is my angularjs code (based on this code): UserService.downloadInvoice(hash).success(function (data, status, headers) { var filename, octetStreamMime = "application/octet-stream", contentType; // Get the headers headers = headers(); if (!filename) { filename = headers["x-filename"] || 'invoice.pdf'; } // Determine the content type from the header or default to "application/octet-stream" contentType = headers["content-type"] || octetStreamMime; if (navigator.msSaveBlob) { var blob = new Blob([data], { type: contentType }); navigator.msSaveBlob(blob, filename); } else { var urlCreator = window.URL || window.webkitURL || window.mozURL || window.msURL; if (urlCreator) { // Try to use a download link var link = document.createElement("a"); if ("download" in link) { // Prepare a blob URL var blob = new Blob([data], { type: contentType }); var url = urlCreator.createObjectURL(blob); $window.saveAs(blob, filename); return; link.setAttribute("href", url); link.setAttribute("download", filename); // Simulate clicking the download link var event = document.createEvent('MouseEvents'); event.initMouseEvent('click', true, true, window, 1, 0, 0, 0, 0, false, false, false, false, 0, null); link.dispatchEvent(event); } else { // Prepare a blob URL // Use application/octet-stream when using window.location to force download var blob = new Blob([data], { type: octetStreamMime }); var url = urlCreator.createObjectURL(blob); $window.location = url; } } } }).error(function (response) { $log.debug(response); }); On my server I use Laravel and this is my response: $headers = array( 'Content-Type' => $contentType, 'Content-Length' => strlen($data), 'Content-Disposition' => $contentDisposition ); return Response::make($data, 200, $headers); where $contentType is application/pdf and $contentDisposition is attachment; filename=" . basename($fileName) . '"' $filename - e.g. 59005-57123123.PDF My response headers: Cache-Control:no-cache Connection:Keep-Alive Content-Disposition:attachment; filename="159005-57123123.PDF" Content-Length:249403 Content-Type:application/pdf Date:Mon, 25 Aug 2014 15:56:43 GMT Keep-Alive:timeout=3, max=1 What am I doing wrong?
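    One thing worth checking on the Angular side is the XHR responseType: if the response arrives as an ordinary string, wrapping it in a Blob can corrupt the file, and it is one of the usual suspects when a downloaded blob misbehaves. Sketched only as an assumption about how UserService wraps $http (module name and URL are placeholders):

      // Ask $http for binary data explicitly; the existing success handler can
      // keep building new Blob([data], { type: 'application/pdf' }).
      angular.module('app').factory('UserService', ['$http', function ($http) {
        return {
          downloadInvoice: function (hash) {
            return $http.get('/invoices/' + hash, {    // placeholder URL
              responseType: 'arraybuffer'               // keep the bytes intact
            });
          }
        };
      }]);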

    Read the article

  • XML appending issue in IE and Chrome browsers

    - by 3gwebtrain
    Hi, i am using this coding for my xml information to append in to html. As well it works fine. but in the ie7,ie8 as well chrome browser it's not propelry. This code work9ing well with firefox,opera, safari.. i unable to find, what is the mistake i made this.. any one help me please? $(function(){ var thisPage; var parentPage; $('ul.left-navi li a').each(function(){ $('ul.left-navi li a').removeClass('current'); var pathname = (window.location.pathname.match(/[^\/]+$/)[0]); var currentPage = $(this).attr('href'); var pathArr = new Array(); pathArr = pathname.split("."); var file = pathArr[pathArr.length - 2]; thisPage = file; if(currentPage==pathname){ $(this).addClass("active"); } }) $.get('career-utility.xml',function(myData){ var receivedData = myData; var myXml = $(myData).find(thisPage); parentPage = thisPage; var overviewTitle = myXml.find('overview').attr('title'); var description = myXml.find('discription').text(); var mainsublinkTitle = myXml.find('mainsublink').attr('title'); var thisTitle = myXml.find("intro").attr('title'); var thisIntro = myXml.find("introinfo").text(); $('<h3>'+overviewTitle+'</h3>').appendTo('.overViewInfo'); $('<p>'+description+'</p>').appendTo('.overViewInfo'); var sublinks = myXml.find('mainsublink').children('sublink'); $('#intro h3').append(thisTitle); $('#intro').append(thisIntro); sublinks.each(function(numsub){ var newSubLink = $(this); var sublinkPage = $(this).attr('pageto'); var linkInfo = $(this).text(); $('ul.career-link').append('<li><a href="'+sublinkPage+'">'+linkInfo+'</a></li>'); }) // alert('thisTitle : '+thisTitle+'thisIntro :'+thisIntro); $(myXml).find('listgroup').each(function(index){ var count = index; var listGroup = $(this); var listGroupTitle = $(this).attr('title'); var shortNote = $(this).attr('shortnote'); var subLink = $(this).find('sublist'); var firstList = $(this).find('list'); $('.grouplist').append('<div class="list-group"><h3>'+listGroupTitle+'</h3><ul class="level-one level' + count + '"></ul></div>'); firstList.each(function(listnum) { $(this).wrapInner('<li>') .find('sublistgroup').wrapInner('<ul>').children().unwrap() .find('sublist').wrapInner('<li>').children().unwrap(); // Append content of 'list' node $('ul.level'+count).append($(this).children()); }); }); }); })
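    When the same traversal works in Firefox, Opera and Safari but not in IE or Chrome, one of the usual suspects is how the response gets parsed, so it can help to tell jQuery explicitly that the response is XML instead of letting each browser guess. A minimal sketch of the same $.get with an explicit type (file name as above):

      // Force the response to be parsed as XML in every browser before the
      // existing handler walks it with .find().
      $.get('career-utility.xml', function (myData) {
        var myXml = $(myData).find(thisPage);   // same traversal as in the handler above
        // ... rest of the existing callback ...
      }, 'xml');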

    Read the article

  • dijit/form/Select broken in Internet Explorer using Esri JavaScript 3.7

    - by disuse
    After developing a web map app in Firefox, I tested my code in Internet Explorer (company standard) to discover that the dijit/form/Select is misbehaving using the latest Esri JavaScript v3.7. The issue I am seeing is that the Select will not update/change from the first option in the list when using v3.7. If I bump the version down to 3.6, it works as expected. I've tried IE browser modes from 7 to 10 and am experiencing the same behavior between all of them. Can someone confirm they are experiencing the same thing? Example in 3.7 - http://jsbin.com/aVIsApO/1/edit Example in 3.6 - http://jsbin.com/odIxETu/7/edit Codeblock var url = "http://services.arcgis.com/V6ZHFr6zdgNZuVG0/ArcGIS/rest/services/Street_Trees/FeatureServer/0"; var frmTrees; require([ "esri/tasks/query", "esri/tasks/QueryTask", "dojo/dom-construct", "dijit/form/Select", "dojo/parser", "dijit/registry", "dojo/on", "dojo/ready", "dojo/_base/connect", "dojo/domReady!" ], function( Query, QueryTask, domConstruct, Select, parser, registry, on, ready, connect ) { ready(function() { frmTrees = registry.byId("trees"); var qt = new QueryTask(url); var query = new Query(); query.where = "FID < 25"; query.orderByFields = ["qSpecies"]; query.returnGeometry = false; query.outFields = ["qSpecies", "TreeID"]; query.groupByFieldsForStatistics = ["qSpecies"]; //query.returnDistinctValues = true; qt.execute(query, function(results) { //var frm_domain_area = dom.byId("domain_area"); var testVals = {}; for (var i = 0; i < results.features.length; i++) { var id = results.features[i].attributes.TreeID; var desc = results.features[i].attributes.qSpecies; if (!testVals[id]) { testVals[id] = true; var selectElem = domConstruct.create("option",{ label: desc + " (" + id + ")", value: id }); frmTrees.addOption(selectElem); } } }); frmTrees.on("change", function() { console.debug(frmTrees.get("value")); }); }); });

    Read the article

  • Can a site get a virus from using cURL?

    - by Mark Tyler
    I have a script which uses simple php curl requests to get the contents from rss/atom feeds.... now my question is it possible that by using curl, is there a chance i might get a virus? Let's say I do a php curl request to a rss feed in feedburner (I know this site does not contain any viruses, but this is only an example) and let's say this site has a malicious virus of some kind. Is there a chance that I might inherit that virus too? If yes, what precautions can I do to make sure something like that never happens. This is the php code I am currently using to fetch the RSS $headers [] = 'Connection: Keep-Alive'; $headers [] = 'Content-type: application/x-www-form-urlencoded; charset=utf-8'; $headers [] = 'Accept-Encoding: application/xhtml+xml,application/xml,text/xml,text/html;q=0.9,*/*;q=0.8'; $ch = curl_init($url); //curl_setopt($ch, CURLOPT_USERAGENT,'Phenoix/0.1.3 (Feed Parser Beta; Beta ; Allow like Gecko) Build/20111112'); curl_setopt($ch, CURLOPT_USERAGENT,'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:9.0.1) Gecko/20100101 Firefox/9.0.1'); //curl_setopt($ch, CURLOPT_REFERER, "http://google.com/auto/clogger"); //curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate' ); curl_setopt($ch, CURLOPT_HTTPHEADER, $headers ); //curl_setopt($channel, CURLOPT_HEADER, 0); curl_setopt($ch, CURLINFO_HEADER_OUT, 1); curl_setopt($ch, CURLOPT_VERBOSE, true); curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_TIMEOUT, 10); curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); curl_setopt($ch, CURLOPT_FAILONERROR, 1); //allow cookie $cookie_file = "cookie1.txt"; curl_setopt($ch, CURLOPT_COOKIESESSION, true); curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie_file); curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file); $xml = curl_exec($ch); if(curl_error($ch)){ //$text .= "Error while updating. Please try again later"; return array(0, curl_error($ch)); } $info = curl_getinfo($ch); curl_close($ch);

    Read the article

  • XML append issue in IE and Chrome

    - by 3gwebtrain
    Hi, I am creating a html page, using xml data. in which i am using the following function. It works fine with firefox,opera,safari. but in case of ie7,ie8, and chrome the data what i am getting from xml, is not appending properly. any one help me to solve this issue? in case any special thing need to concentrate on append funcation as well let me know.. $(function(){ var thisPage; var parentPage; $('ul.left-navi li a').each(function(){ $('ul.left-navi li a').removeClass('current'); var pathname = (window.location.pathname.match(/[^\/]+$/)[0]); var currentPage = $(this).attr('href'); var pathArr = new Array(); pathArr = pathname.split("."); var file = pathArr[pathArr.length - 2]; thisPage = file; if(currentPage==pathname){ $(this).addClass("active"); } }) $.get('career-utility.xml',function(myData){ var myXml = $(myData).find(thisPage); parentPage = thisPage; var overviewTitle = myXml.find('overview').attr('title'); var description = myXml.find('discription').text(); var mainsublinkTitle = myXml.find('mainsublink').attr('title'); var thisTitle = myXml.find("intro").attr('title'); var thisIntro = myXml.find("introinfo").text(); $('<h3>'+overviewTitle+'</h3>').appendTo('.overViewInfo'); $('<p>'+description+'</p>').appendTo('.overViewInfo'); var sublinks = myXml.find('mainsublink').children('sublink'); $('#intro h3').append(thisTitle); $('#intro').append(thisIntro); sublinks.each(function(numsub){ var newSubLink = $(this); var sublinkPage = $(this).attr('pageto'); var linkInfo = $(this).text(); $('ul.career-link').append('<li><a href="'+sublinkPage+'">'+linkInfo+'</a></li>'); }) $(myXml).find('listgroup').each(function(index){ var count = index; var listGroup = $(this); var listGroupTitle = $(this).attr('title'); var shortNote = $(this).attr('shortnote'); var subLink = $(this).find('sublist'); var firstList = $(this).find('list'); $('.grouplist').append('<div class="list-group"><h3>'+listGroupTitle+'</h3><ul class="level-one level' + count + '"></ul></div>'); firstList.each(function(listnum) { $(this).wrapInner('<li>') .find('sublistgroup').wrapInner('<ul>').children().unwrap() .find('sublist').wrapInner('<li>').children().unwrap(); $('ul.level'+count).append($(this).children()); }); }); }); }) Thanks for advance..

    Read the article

  • [C++] Upload image to ImageShack

    - by cinek1lol
    Hi! I would like to send pictures via a program written in C + +. - OK WinExec("C:\\curl\\curl.exe -H Expect: -F \"fileupload=@C:\\curl\\ok.jpg\" -F \"xml=yes\" -# \"http://www.imageshack.us/index.php\" -o data.txt -A \"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1\" -e \"http://www.imageshack.us\"", NULL); It works, but I would like to send the pictures from pre-loaded carrier to a variable char (you know what I mean? First off, I load the pictures into a variable and then send the variable), cause now I have to specify the path of the picture on a disk. I wanted to write this program in c++ by using the curl library, not through exe. extension. I have also found such a program (which has been modified by me a bit) #include <stdio.h> #include <string.h> #include <iostream> #include <curl/curl.h> #include <curl/types.h> #include <curl/easy.h> int main(int argc, char *argv[]) { CURL *curl; CURLcode res; struct curl_httppost *formpost=NULL; struct curl_httppost *lastptr=NULL; struct curl_slist *headerlist=NULL; static const char buf[] = "Expect:"; curl_global_init(CURL_GLOBAL_ALL); /* Fill in the file upload field */ curl_formadd(&formpost, &lastptr, CURLFORM_COPYNAME, "send", CURLFORM_FILE, "nowy.jpg", CURLFORM_END); curl_formadd(&formpost, &lastptr, CURLFORM_COPYNAME, "nowy.jpg", CURLFORM_COPYCONTENTS, "nowy.jpg", CURLFORM_END); curl_formadd(&formpost, &lastptr, CURLFORM_COPYNAME, "submit", CURLFORM_COPYCONTENTS, "send", CURLFORM_END); curl = curl_easy_init(); headerlist = curl_slist_append(headerlist, buf); if(curl) { curl_easy_setopt(curl, CURLOPT_URL, "http://www.imageshack.us/index.php"); if ( (argc == 2) && (!strcmp(argv[1], "xml=yes")) ) curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist); curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost); res = curl_easy_perform(curl); curl_easy_cleanup(curl); curl_formfree(formpost); curl_slist_free_all (headerlist); } system("pause"); return 0; }

    Read the article

  • IE browser caching and the jQuery Form Plugin

    - by Harfleur
    Like so many lost souls before me, I'm floundering in the snake pit that is Ajax form submission and IE browser caching. I'm trying to write a simple script using the jQuery Form Plugin to Ajaxify Wordpress comments. It's working fine in Firefox, Chrome, Safari, et. al., but in IE, the response text is cached with the result that Ajax is pulling in the wrong comment. jQuery(this).ajaxSubmit({ success: function(data) { var response = $("<ol>"+data+"</ol>"); response.find('.commentlist li:last').hide().appendTo(jQuery('.commentlist')).slideDown('slow'); } }); ajaxSubmit sends the comment to wp-comments-post.php, which inelegantly spits back the entire page as a response. So, despite the fact that it's ugly as toads, I'm sticking the response text in a variable, using :last to isolate the most recent comment, and sliding it down in its place. IE, however, is returning the cached version of the page, which doesn't include the new comment. So ".commentlist li:last" selects the previous comment, a duplicate of which then uselessly slides down beneath the original. I've tried setting "cache: false" in the ajaxSubmit options, but it has no effect. I've tried setting a url option and tacking on a random number or timestamp, but it winds up being attached to the POST that submits the comment to the server rather than the GET that returns the response, and so has no effect. I'm not sure what else to try. Everything works fine in IE if I turn off browser caching, but that's obviously not something I can expect anyone viewing the page to do. Any help will be hugely appreciated. Thanks in advance! EDIT WITH A PROGRESS REPORT: A couple of people have suggested using PHP headers to prevent caching, and this does indeed work. The trouble is that wp-comments-post is spitting back the entire page when a new comment is submitted, and the only way I can see to add headers is to put them in the Wordpress post template, which disables caching on all posts at all times--not quite the behavior I'm looking for. Is there a way to set a php conditional--"if is_ajax" or something like that--that would keep the headers from being applied during regular pageloads, but plug them in if the page was called by an Ajax GET?

    Read the article

  • IE7 float problems and hyperlinks not clickable

    - by Uffo
    Markup <ul class="navigation clearfix"> <li class="navigation-top"></li> <div class="first-holder" style="height:153px;"> <dl class="hold-items clearfix"> <dd class="clearfix with"><a href="http://site.com" title="Protokoll">Protokoll</a></dd> <dd class="with-hover"><a href="http://site.com" title="Mein/e Unternehmen">Mein/e Unternehmen</a></dd> <dd class="with"><a class="face-me" href="http://site.com" title="Erweiterte Suche">Erweiterte Suche</a></dd> <dd class="with"><a href="http://site.com" title="Abmelden">Abmelden</a></dd> </dl> </div><!--[end] /.first-holder--> <li class="navigation-bottom"></li> </ul><!--[end] /.navigation--> Css: .first-holder{height:304px;position:relative;width:178px;overflow:hidden;margin-bottom:0px;padding-bottom: 0px;} .hold-items{top:0px;position:absolute;} .navigation dd.with{line-height:38px;background:url('/images/sprite.png') no-repeat -334px -46px;width:162px;height:38px;padding-bottom:0px;overflow: hidden;} .navigation dd.with a{position:relative;outline:0;display:block;font-weight:bold;color:#3f78c0;padding-left:10px;line-height:38px;} .with-hover{background:url('/images/sprite.png') no-repeat -505px -47px;width:178px;height:38px;line-height:38px;overflow:none;} .with-hover a{position:relative;display:block;font-weight:bold;color:#fff;padding-left:10px} .navigation-top{background:url('/images/sprite.png') no-repeat -694px -46px;width:160px;height:36px;} .navigation-top a{display:block;outline:0;height:20px;padding-top:18px;padding-left:138px;} .navigation-top a span{display:block;background:url('/images/sprite.png') no-repeat -212px -65px;width:8px;height:6px;} .navigation-bottom{background:url('/images/sprite.png') no-repeat -784px -402px;width:160px;height:37px;} .navigation-bottom a{display:block;outline:0;height:20px;padding-top:18px;padding-left:138px;} .navigation-bottom a span{display:block;background:url('/images/sprite.png') no-repeat -212px -74px;width:8px;height:6px;} Also the links, are not clickable, if I click on a link in IE7 it doesn't do the action..it doesn't redirect me to the location. This is how it looks in IE7: http://screencast.com/t/MGY4NjljZjc This is how it look in IE8,Firefox,Chrome and so on http://screencast.com/t/MzhhMDQ1M What I'm doing wrong PS: .navigation-top a span and .navigation-bottom a span I'm using some where else, but that it's ok it works fine.

    Read the article

  • CSC folder data access and roaming profile issues (Vista with Server 2003, then 2008)

    - by Alex Jones
    I'm a junior sysadmin for an IT contractor that helps small, local government agencies, like little towns and the like. One of our clients, a public library with ~ 50 staff users, was recently migrated from Server 2003 Standard to Server 2008 R2 Standard in a very short timeframe; our senior employee, the only network engineer, had suddenly put in his two weeks notice, so management pushed him to do this project before quitting. A bit hasty on management's part? Perhaps. Could we do anything about that? Nope. Do I have to fix this all by myself? Pretty much. The network is set up like this: a) 50ish staff workstations, all running Vista Business SP2. All staff use MS Outlook, which uses RPC-over-HTTPS ("Outlook Anywhere") for cached Exchange access to an offsite location. b) One new (virtualized) Server 2008 R2 Standard instance, running atop a Server 2008 R2 host via Hyper-V. The VM is the domain's DC, and also the site's one and only file server. Let's call that VM "NEWBOX". c) One old physical Server 2003 Standard server, running the same roles. Let's call it "OLDBOX". It's still on the network and accessible, but it's been demoted, and its shares have been disabled. No data has been deleted. c) Gigabit Ethernet everywhere. The organization's only has one domain, and it did not change during the migration. d) Most users were set up for a combo of redirected folders + offline files, but some older employees who had been with the organization a long time are still on roaming profiles. To sum up: the servers in question handle user accounts and files, nothing else (eg, no TS, no mail, no IIS, etc.) I have two major problems I'm hoping you can help me with: 1) Even though all domain users have had their redirected folders moved to the new server, and loggin in to their workstations and testing confirms that the Documents/Music/Whatever folders point to the new paths, it appears some users (not laptops or anything either!) had been working offline from OLDBOX for a long time, and nobody realized it. Here's the ugly implication: a bunch of their data now lives only in their CSC folders, because they can't access the share on OLDBOX and sync with it finally. How do I get this data out of those CSC folders, and onto NEWBOX? 2) What's the best way to migrate roaming profile users to non-roaming ones, without losing vital data like documents, any lingering PSTs, etc? Things I've thought about trying: For problem 1: a) Reenable the documents share on OLDBOX, force an Offline Files sync for ALL domain users, then copy OLDBOX's share's data to the equivalent share on NEWBOX. Reinitialize the Offline Files cache for every user. With this: How do I safely force a domain-wide Offline Files sync? Could I lose data by reenabling the share on OLDBOX and forcing the sync? Afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation? b) Determine which users have unsynced changes to OLDBOX (again, how?), search each user's CSC folder domain-wide via workstation admin shares, and grab the unsynched data. Reinitialize the Offline Files cache for every user. With this: How can I detect which users have unsynched changes with a script? How can I search each user's CSC folder, when the ownership and permissions set for CSC folders are so restrictive? Again, afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation? 
c) Manually visit each workstation, copy the contents of the CSC folder, and manually copy that data onto NEWBOX. Reinitialize the Offline Files cache for every user. With this: Again, how do I 'break into' the CSC folder and get to its data? As an experiment, I took one workstation's HD offsite, imaged it for safety, and then tried the following with one of our shop PCs, after attaching the drive: grant myself full control of the folder (failed), grant myself ownership of the folder (failed), run chkdsk on the whole drive to make sure nothing's messed up (all OK), try to take full control of the entire drive (failed), try to take ownership of the entire drive (failed) MS KB articles and Googling around suggests there's a utility called CSCCMD that's meant for this exact scenario...but it looks like it's available for XP, not Vista, no? Again, afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation? For problem 2: a) Figure out which users are on roaming profiles, and where their profiles 'live' on the server. Create new folders for them in the redirected folders repository, migrate existing data, and disable the roaming. With this: Finding out who's roaming isn't hard. But what's the best way to disable the roaming itself? In AD Users and Computers, or on each user's workstation? Doing it centrally on the server seems more efficient; that said, all of the KB research I've done turns up articles on how to go from local to roaming, not the other way around, so I don't have good documentation on this. In closing: we have good backups of NEWBOX and OLDBOX, but not of the workstations themselves, so anything drastic on the client side would need imaging and testing for safety. Thanks for reading along this far! Hopefully you can help me dig us out of this mess.

    Read the article

  • Can't mass-assign protected attributes -- unsolved issue

    - by nfriend21
    I have read about 10 different posts here about this problem, and I have tried every single one and the error will not go away. So here goes: I am trying to have a nested form on my users/new page, where it accepts user-attributes and also company-attributes. When you submit the form: Here's what my error message reads: ActiveModel::MassAssignmentSecurity::Error in UsersController#create Can't mass-assign protected attributes: companies app/controllers/users_controller.rb:12:in `create' Here's the code for my form: <%= form_for @user do |f| %> <%= render 'shared/error_messages', object: f.object %> <%= f.fields_for :companies do |c| %> <%= c.label :name, "Company Name"%> <%= c.text_field :name %> <% end %> <%= f.label :name %> <%= f.text_field :name %> <%= f.label :email %> <%= f.text_field :email %> <%= f.label :password %> <%= f.password_field :password %> <%= f.label :password_confirmation %> <%= f.password_field :password_confirmation %> <br> <% if current_page?(signup_path) %> <%= f.submit "Sign Up", class: "btn btn-large btn-primary" %> Or, <%= link_to "Login", login_path %> <% else %> <%= f.submit "Update User", class: "btn btn-large btn-primary" %> <% end %> <% end %> Users Controller: class UsersController < ApplicationController def index @user = User.all end def new @user = User.new end def create @user = User.create(params[:user]) if @user.save session[:user_id] = @user.id #once user account has been created, a session is not automatically created. This fixes that by setting their session id. This could be put into Controller action to clean up duplication. flash[:success] = "Your account has been created!" redirect_to tasks_path else render 'new' end end def show @user = User.find(params[:id]) @tasks = @user.tasks end def edit @user = User.find(params[:id]) end def update @user = User.find(params[:id]) if @user.update_attributes(params[:user]) flash[:success] = @user.name.possessive + " profile has been updated" redirect_to @user else render 'edit' end #if @task.update_attributes params[:task] #redirect_to users_path #flash[:success] = "User was successfully updated." #end end def destroy @user = User.find(params[:id]) unless current_user == @user @user.destroy flash[:success] = "The User has been deleted." end redirect_to users_path flash[:error] = "Error. You can't delete yourself!" end end Company Controller class CompaniesController < ApplicationController def index @companies = Company.all end def new @company = Company.new end def edit @company = Company.find(params[:id]) end def create @company = Company.create(params[:company]) #if @company.save #session[:user_id] = @user.id #once user account has been created, a session is not automatically created. This fixes that by setting their session id. This could be put into Controller action to clean up duplication. #flash[:success] = "Your account has been created!" 
#redirect_to tasks_path #else #render 'new' #end end def show @comnpany = Company.find(params[:id]) end end User model class User < ActiveRecord::Base has_secure_password attr_accessible :name, :email, :password, :password_confirmation has_many :tasks, dependent: :destroy belongs_to :company accepts_nested_attributes_for :company validates :name, presence: true, length: { maximum: 50 } VALID_EMAIL_REGEX = /\A[\w+\-.]+@[a-z\d\-.]+\.[a-z]+\z/i validates :email, presence: true, format: { with: VALID_EMAIL_REGEX }, uniqueness: { case_sensitive: false } validates :password, length: { minimum: 6 } #below not needed anymore, due to has_secure_password #validates :password_confirmation, presence: true end Company Model class Company < ActiveRecord::Base attr_accessible :name has_and_belongs_to_many :users end Thanks for your help!!

    Read the article

  • Help with abstract class in Java with private variable of type List<E>

    - by Nazgulled
    Hi, It's been two years since I last coded something in Java so my coding skills are bit rusty. I need to save data (an user profile) in different data structures, ArrayList and LinkedList, and they both come from List. I want to avoid code duplication where I can and I also want to follow good Java practices. For that, I'm trying to create an abstract class where the private variables will be of type List<E> and then create 2 sub-classes depending on the type of variable. Thing is, I don't know if I'm doing this correctly, you can take a look at my code: Class: DBList import java.util.List; public abstract class DBList { private List<UserProfile> listName; private List<UserProfile> listSSN; public List<UserProfile> getListName() { return this.listName; } public List<UserProfile> getListSSN() { return this.listSSN; } public void setListName(List<UserProfile> listName) { this.listName = listName; } public void setListSSN(List<UserProfile> listSSN) { this.listSSN = listSSN; } } Class: DBListArray import java.util.ArrayList; public class DBListArray extends DBList { public DBListArray() { super.setListName(new ArrayList<UserProfile>()); super.setListSSN(new ArrayList<UserProfile>()); } public DBListArray(ArrayList<UserProfile> listName, ArrayList<UserProfile> listSSN) { super.setListName(listName); super.setListSSN(listSSN); } public DBListArray(DBListArray dbListArray) { super.setListName(dbListArray.getListName()); super.setListSSN(dbListArray.getListSSN()); } } Class: DBListLinked import java.util.LinkedList; public class DBListLinked extends DBList { public DBListLinked() { super.setListName(new LinkedList<UserProfile>()); super.setListSSN(new LinkedList<UserProfile>()); } public DBListLinked(LinkedList<UserProfile> listName, LinkedList<UserProfile> listSSN) { super.setListName(listName); super.setListSSN(listSSN); } public DBListLinked(DBListLinked dbListLinked) { super.setListName(dbListLinked.getListName()); super.setListSSN(dbListLinked.getListSSN()); } } 1) Does any of this make any sense? What am I doing wrong? Do you have any recommendations? 2) It would make more sense for me to have the constructors in DBList and calling them (with super()) in the subclasses but I can't do that because I can't initialize a variable with new List<E>(). 3) I was thought to do deep copies whenever possible and for that I always override the clone() method of my classes and code it accordingly. But those classes never had any lists, sets or maps on them, they only had strings, ints, floats. How do I do deep copies in this situation?

    Read the article
