Search Results

Search found 296 results on 12 pages for 'tex'.

Page 5 of 12

  • How to replace LaTeX macros with their definitions (using LaTeX)

    - by RamyenHead
    How can I replace all occurrences of user-defined LaTeX macros with their definitions? For example, given this file old.tex \newcommand{\blah}[2]{#1 \to #2} ... foo \blah{egg}{spam} bar ... how can I automatically generate the file below? new.tex ... foo egg \to spam bar ... Instead of reimplementing LaTeX macro logic in Perl, can I use the LaTeX or TeX engine itself to do this?
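
    The example is simple enough that the scripted route the question hopes to avoid fits in a few lines; below is a rough Python sketch, offered only as a fallback. It is assumption-heavy: it handles only single-level \newcommand definitions with brace-free arguments, and the old.tex/new.tex file names are taken straight from the example.

        # Naive sketch: expand simple \newcommand macros with positional
        # arguments, one level deep, no nested braces. Not a TeX parser.
        import re

        def parse_newcommands(text):
            """Collect \\newcommand{\\name}[nargs]{body} definitions (simple cases only)."""
            defs = {}
            pattern = re.compile(r'\\newcommand\{\\(\w+)\}(?:\[(\d+)\])?\{(.*?)\}', re.S)
            for name, nargs, body in pattern.findall(text):
                defs[name] = (int(nargs or 0), body)
            return defs

        def expand(text, defs):
            for name, (nargs, body) in defs.items():
                # Match \name{arg1}{arg2}... with the declared number of braced arguments.
                pattern = re.compile(r'\\' + re.escape(name) + r'(?![A-Za-z])'
                                     + r'\{([^{}]*)\}' * nargs)
                def repl(match, body=body):
                    out = body
                    for i, arg in enumerate(match.groups(), start=1):
                        out = out.replace('#%d' % i, arg)
                    return out
                text = pattern.sub(repl, text)
            # Drop the definitions themselves.
            return re.sub(r'\\newcommand\{\\\w+\}(?:\[\d+\])?\{.*?\}\s*', '', text, flags=re.S)

        with open('old.tex') as f:          # file names taken from the question
            source = f.read()
        with open('new.tex', 'w') as f:
            f.write(expand(source, parse_newcommands(source)))

    Anything beyond this simple case (nested braces, optional arguments, recursive macros) is exactly where reimplementing TeX's expansion gets painful, which is why a TeX-level solution is worth looking for.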

    Read the article

  • What's wrong with this jQuery?

    - by Gandalf StormCrow
    I'm getting a syntax error in Firebug. Here is the code: $('#moderator-attention').live('toogle', function(){ function () { $(".moderator-tex").show(); }, function () { $(".moderator-tex").hide(); } }); I want to create a toggle: when a button is clicked, the textarea with class moderator-tex should appear, and when another button is clicked, it should be hidden.

    Read the article

  • More efficient R / Sweave / TeXShop work-flow?

    - by user594795
    I've now got everything working properly on my Mac OS X 10.6 machine so that I can create decent-looking LaTeX documents with Sweave that combine snippets of R code, output, and LaTeX formatting. Unfortunately, I feel like my workflow is a bit clunky and inefficient: Using TextWrangler, I write LaTeX code and R code (each R code chunk surrounded by <<= above and @ below) together in one .Rnw file. After saving changes, I call the .Rnw file from R using the Sweave command Sweave(file="/Users/mymachine/Documents/Assign4.Rnw", syntax="SweaveSyntaxNoweb") In response, R outputs the following message: You can now run LaTeX on 'Assign4.tex' So then I find the .tex file (Assign4.tex) in the R directory and copy it over to the folder in my documents ~/Documents/ where the .Rnw file is sitting (to keep everything in one place). Then I open the .tex file (e.g. Assign4.tex) in TeXShop and compile it there into PDF format. Only at this point do I get to see any changes I have made to the document and whether it 'looks nice'. Is there a way I can compile everything with one button click? Specifically, it would be nice to call Sweave / R directly from TextWrangler or TeXShop. I suspect it might be possible to script this in Terminal, but I have no experience with Terminal. Please let me know if there are any other things I can do to streamline or improve my workflow.
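
    One common way to get the one-click effect is a small driver script that runs Sweave and then pdflatex in the directory that holds the .Rnw, so the .tex and .pdf land next to it. A minimal Python sketch, assuming Rscript and pdflatex are on the PATH and using the Assign4.Rnw name from the question:

        #!/usr/bin/env python
        # Minimal driver: Sweave the .Rnw, then run pdflatex on the result.
        # Assumes Rscript and pdflatex are on the PATH; file name from the question.
        import os
        import subprocess
        import sys

        rnw = sys.argv[1] if len(sys.argv) > 1 else "Assign4.Rnw"
        workdir = os.path.dirname(os.path.abspath(rnw))
        base = os.path.splitext(os.path.basename(rnw))[0]

        # Run Sweave in the directory holding the .Rnw so Assign4.tex is written there.
        subprocess.check_call(
            ["Rscript", "-e",
             'Sweave("%s", syntax="SweaveSyntaxNoweb")' % os.path.basename(rnw)],
            cwd=workdir)

        # Compile the generated .tex; run twice so cross-references settle.
        for _ in range(2):
            subprocess.check_call(
                ["pdflatex", "-interaction=nonstopmode", base + ".tex"],
                cwd=workdir)

        print("Wrote", os.path.join(workdir, base + ".pdf"))

    The script can be run from Terminal after saving the .Rnw, or wired into any editor that can invoke a shell command on the current file, which gets close to the one-click build.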

    Read the article

  • Max number of nested \input in LaTeX/Beamer

    - by P. Henaff
    When using the beamer document class, it looks like the third level of nested \input is ignored: \documentclass{beamer} \input{body} body.tex: \begin{document} \input{file1} \input{file2} \end{document} file2.tex: \input{file21} The content of file21.tex is ignored with the beamer document class, but is correctly inserted if I use the article document class, for example. Has anyone seen something like this?

    Read the article

  • "Untrusted packages could compromise your system's security." appears while trying to install anything

    - by maria
    Hi, I've freshly installed Ubuntu 10.04 on a new computer. I'm trying to install the applications I need on it (my old computer is broken and I have to send it in for service). I've managed to install texlive, but then I can't install anything else. All the software I want is software I successfully installed on my old computer (with the same version of Ubuntu), so I don't understand why the terminal says: maria@marysia-ubuntu:~$ sudo aptitude install emacs Reading package lists... Done Building dependency tree Reading state information... Done Reading extended state information Initializing package states... Done The following NEW packages will be installed: emacs emacs23{a} emacs23-bin-common{a} emacs23-common{a} emacsen-common{a} 0 packages upgraded, 5 newly installed, 0 to remove and 0 not upgraded. Need to get 23,9MB of archives. After unpacking 73,8MB will be used. Do you want to continue? [Y/n/?] Y WARNING: untrusted versions of the following packages will be installed! Untrusted packages could compromise your system's security. You should only proceed with the installation if you are certain that this is what you want to do. emacs emacs23-bin-common emacsen-common emacs23-common emacs23 Do you want to ignore this warning and proceed anyway? To continue, enter "Yes"; to abort, enter "No" I tried installing other editors as well, with the same result. Since I was confident that the packages I wanted to install were safe, I finally entered "Yes". The installation finished successfully, but the editor doesn't understand any .tex file (the .tex files themselves are definitely fine): this is pdfTeX, Version 3.1415926-1.40.10 (TeX Live 2009/Debian) restricted \write18 enabled. entering extended mode (./Szarfi.tex ! Undefined control sequence. l.2 \documentclass {book} ? What's more, I've realised that in the Synaptic Package Manager there is no package marked as supported by Canonical... Any tips? Thanks in advance.

    Read the article

  • How to turn iptables stateless?

    - by tex
    Hi, I'm running a Linux server that, from time to time, faces heavy load, and the conntrack table overflows. Since its iptables firewall ruleset is very simple, I'd like to switch it to stateless mode. I know that iptables can operate in stateful connection-tracking mode and in a stateless mode. My firewall rules are all in place and I'm pretty sure they are stateless, but my question is: how can I verify that the firewall is really operating in stateless mode?
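
    There is no single switch that flips iptables into a "stateless mode"; in practice the firewall is stateless when no loaded rule uses the state/conntrack matches and nothing (such as the nat table) pulls in the nf_conntrack modules. A rough Python sketch of that check, which just shells out to iptables-save and lsmod and therefore needs root:

        # Rough check: do any loaded iptables rules use connection tracking,
        # and is the conntrack kernel module present at all? Run as root.
        import subprocess

        def run(cmd):
            return subprocess.check_output(cmd, universal_newlines=True)

        rules = run(["iptables-save"])
        stateful_rules = [line for line in rules.splitlines()
                          if "-m state" in line or "-m conntrack" in line]

        modules = run(["lsmod"])
        conntrack_loaded = any(
            line.split()[0].startswith(("nf_conntrack", "ip_conntrack"))
            for line in modules.splitlines()[1:] if line.split())

        print("Rules that use connection tracking:", len(stateful_rules))
        for line in stateful_rules:
            print("  ", line)
        print("nat table present in ruleset:", "*nat" in rules)
        print("conntrack kernel module loaded:", conntrack_loaded)

    If stateful rules or the nat table show up, connection tracking stays in play even though the rest of the ruleset looks stateless; the usual follow-ups are raising net.netfilter.nf_conntrack_max or preventing the conntrack modules from loading at all.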

    Read the article

  • Why can I not load a simple pixel shader effect (.fx) file in XNA?

    - by Mehdi Bugnard
    I just want to load a simple *.fx file into my project to make a (pixel shader) effect. But whenever I try to compile my project, I get the following error in the Visual Studio Error List: Errors compiling .. ID3DXEffectCompiler: There were no techniques ID3DXEffectCompiler: Compilation failed I already searched on Google and found many people with the same problem, and I realized that it might be an encoding problem, with the '\n' line endings not being recognized. I tried copying and pasting into Notepad and saving with ASCII or UTF-8 encoding, but the result is always the same. Do you have any ideas? Thanks a lot. :-) Here is my .fx file: sampler BaseTexture : register(s0); sampler MaskTexture : register(s1) { addressU = Clamp; addressV = Clamp; }; //All of these variables are pixel values //Feel free to replace with float2 variables float MaskLocationX; float MaskLocationY; float MaskWidth; float MaskHeight; float BaseTextureLocationX; //This is where your texture is to be drawn float BaseTextureLocationY; //texCoord is different, it is the current pixel float BaseTextureWidth; float BaseTextureHeight; float4 main(float2 texCoord : TEXCOORD0) : COLOR0 { //We need to calculate where in terms of percentage to sample from the MaskTexture float maskPixelX = texCoord.x * BaseTextureWidth + BaseTextureLocationX; float maskPixelY = texCoord.y * BaseTextureHeight + BaseTextureLocationY; float2 maskCoord = float2((maskPixelX - MaskLocationX) / MaskWidth, (maskPixelY - MaskLocationY) / MaskHeight); float4 bitMask = tex2D(MaskTexture, maskCoord); float4 tex = tex2D(BaseTexture, texCoord); //It is a good idea to avoid conditional statements in a pixel shader if you can use math instead. return tex * (bitMask.a); //Alternate calculation to invert the mask, you could make this a parameter too if you wanted //return tex * (1.0 - bitMask.a); }

    Read the article

  • "VLC could not read the file" error when trying to play DVDs

    - by stephenmurdoch
    I can watch most DVDs on my machine using VLC, but today I went to watch Thor and it won't play. libdvdread4 and libdvdcss2 are at the latest versions. vlc -v returns 1.1.4. w32codecs are installed and reinstalled; ubuntu-restricted-extras are the same as above. My machine recognises the disc and I can open the folder and browse the assorted .vob files, of which there are many. None of them will open in VLC, or in MPlayer etc. When I run vlc -vvv /media/THOR/VIDEO_TS/VTS_03_1.VOB I get: File Reading Failed VLC could not read the file I also see command-line output like this: [0x963f47c] main filter debug: removing module "swscale" [0x963a4b4] main generic debug: A filter to adapt decoder to display is needed [0x964be84] main filter debug: looking for video filter2 module: 18 candidates [0x964be84] swscale filter debug: 720x576 chroma: I420 -> 979x551 chroma: RV32 with scaling using Bicubic (good quality) [0x964be84] main filter debug: using video filter2 module "swscale" ..... [0x959f4e4] main video output warning: late picture skipped (-10038 > -15327) [0x963a4b4] main generic debug: auto hidding mouse [0x93ca094] main input warning: clock gap, unexpected stream discontinuity [0x93ca094] main input warning: feeding synchro with a new reference point trying to recover from clock gap [0x959f4e4] main video output warning: early picture skipped ...... ac-tex damaged at 0 12 ac-tex damaged at 6 20 ac-tex damaged at 12 28 This happens with both the onboard drive and a known-good USB DVD player. I don't have a standalone DVD player to try with the TV. I am going to watch another film instead for now, because I can do that. I just can't watch Thor, and I'm pretty confident that the disc is OK. It is a rental, but it's clean and there are no surface abrasions. I even cleaned it with Christian Dior aftershave to make sure.

    Read the article

  • What could cause a pixel shader to paint outside the lines of the vertex shader output?

    - by Rei Miyasaka
    From what I understand, the pixels that a pixel shader operates on are specified implicitly by the SV_POSITION output (in DirectX) of the vertex shader. What then could cause a pixel shader to render in the middle of nowhere? I used the new Visual Studio 2012 graphics debugger to visualize my vertex and pixel shader output. This is the output from a DrawIndexed() call that draws a cube: The pink part is the rendered output of the pixel shader, which takes the cube on its left as its input. The vertex shader code: cbuffer Buf { float4x4 final; }; struct In { float4 pos:POSITION; float3 norm:NORMAL; float2 texuv:TEXCOORD; }; struct Out { float4 col:COLOR; float2 tex:TEXCOORD; float4 pos:SV_POSITION; }; Out main(In input) { Out output; output.pos = mul(input.pos, final); output.col = float4(1.0f, 0.5f, 0.5f, 1.0f); output.tex = input.texuv; return output; } And the pixel shader: struct In { float4 col:COLOR; float2 tex:TEXCOORD; float4 pos:SV_POSITION; }; float4 main(In input) : SV_TARGET { return input.col; } The raster stage is the only thing between the vertex shader and the pixel shader, so my suspicion is that it's some raster stage settings. But the raster stage shouldn't change the shape of the vertex shader output so drastically, should it?

    Read the article

  • Recording slow web stream

    - by Budric
    I'm trying to record an MPEG-2 video stream from a website that doesn't have the greatest bandwidth. The video often buffers. I want to download the stream and watch it offline. The stream format received is: Stream #0.0[0x44]: Audio: mp2, 48000 Hz, stereo, s16, 192 kb/s Stream #0.1[0x45]: Video: mpeg2video (Main), yuv420p, 704x576 [PAR 16:11 DAR 16:9], 15000 kb/s, 27.19 fps, 25 tbr, 90k tbn, 50 tbc I use the following command to transcode the stream: ffmpeg -i "http://url" -y -vcodec libx264 -b 3000k -acodec copy /tmp/stream.mp4 Unfortunately, after a few seconds ffmpeg stops recording with an error: [mpegts @ 0x1f0b9c0] PES packet size mismatch [mp2 @ 0x1f14640] incomplete frame Error while decoding stream #0.0 [mpeg2video @ 0x1f16860] ac-tex damaged at 0 26 [mpeg2video @ 0x1f16860] Warning MVs not available I've tried encoding with VLC as well, with similar issues. Although VLC doesn't stop encoding, the output video has regions where it hangs. vlc -I dummy "http://url" --network-caching="1000" --sout="#transcode{vcodec=h264,vb=3000,acodec=mp3,ab=192}:std{access=file,mux=mp4,dst=/tmp/stream.mp4}" [mpeg2video @ 0x7f2d4c001e20] ac-tex damaged at 9 33 [mpeg2video @ 0x7f2d4c001e20] Warning MVs not available [mpeg2video @ 0x7f2d4c001e20] concealing 132 DC, 132 AC, 132 MV errors [mpeg2video @ 0x7f2d4c001e20] ac-tex damaged at 16 17 [mpeg2video @ 0x7f2d4c001e20] Warning MVs not available [mpeg2video @ 0x7f2d4c001e20] concealing 836 DC, 836 AC, 836 MV errors libdvbpsi error (PSI decoder): TS discontinuity (received 4, expected 3) for PID 0 I also tried FLV transcoding, and it shows its own set of issues, like the output FLV file hanging in certain parts. Does anyone know what's wrong or how to fix this?
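
    Since the goal is offline viewing, one workaround is to separate the slow download from the transcode: save the raw MPEG-TS bytes to disk first (no real-time decoding, so buffering gaps only make the download slower), then run ffmpeg on the local file. A minimal Python sketch using the requests library; the URL is a placeholder, and if the source is a live stream rather than a finite file the download loop needs its own stop condition:

        # Download the raw transport stream first, then transcode the local copy.
        # The URL is a placeholder; requires the 'requests' package and ffmpeg.
        import subprocess
        import requests

        url = "http://example.com/stream"       # placeholder for the real stream URL
        raw_path = "/tmp/stream.ts"
        out_path = "/tmp/stream.mp4"

        with requests.get(url, stream=True, timeout=30) as resp:
            resp.raise_for_status()
            with open(raw_path, "wb") as f:
                # For a live (endless) stream, break out after a byte or time budget.
                for chunk in resp.iter_content(chunk_size=64 * 1024):
                    if chunk:
                        f.write(chunk)

        # Transcode offline; damaged packets may still produce warnings,
        # but the encode no longer races against the network.
        subprocess.check_call([
            "ffmpeg", "-i", raw_path,
            "-vcodec", "libx264", "-b:v", "3000k",
            "-acodec", "copy",
            "-y", out_path,
        ])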

    Read the article

  • Keeping files that are often changed in sync between desktop and laptop

    - by N.N.
    I'm looking for a way to keep a desktop and a laptop in sync. What I want to keep in sync are some folders, mainly ~/Documents, that change often while I work on them. If it matters, I can connect to my desktop from anywhere via a URL, but my laptop is harder to access since it might be behind NAT and such. I have been looking at Ubuntu One, but it doesn't seem to go well with documents written in LaTeX. If I work on a .tex file in the Ubuntu One directory and compile it (with pdflatex) every now and then (as often as every 10 seconds when working), it creates several new files, including a PDF, which are uploaded to Ubuntu One; this seems wasteful, since it creates continuous uploads while I'm working on .tex files. I also usually keep .tex documents version controlled with git, and then every commit (which can also happen frequently) causes an upload (from changes in ./.git), so it happens continuously while working. Another example is editing images that are saved often. What I think would be best is for the sync to happen every ten minutes or at the end of every working session (but there might be some other way to handle this?).
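
    If a periodic push is enough, one low-tech option is to wrap rsync in a small loop that syncs ~/Documents every ten minutes. The sketch below is Python and assumes the desktop is reachable over SSH; the host, user, and paths are placeholders, and whether to also exclude .git and the PDFs is a judgement call (excluding them avoids the constant-upload problem, but then history and build output only move when pushed explicitly).

        # Periodic one-way sync of ~/Documents to the desktop over SSH.
        # Host, user and interval are placeholders; rsync and ssh must be installed.
        import subprocess
        import time

        SOURCE = "/home/me/Documents/"             # trailing slash: sync contents
        DEST = "user@desktop.example.com:Documents/"
        INTERVAL = 10 * 60                         # seconds between syncs
        EXCLUDES = ["*.aux", "*.log", "*.out", "*.synctex.gz"]   # LaTeX build noise
        # Add ".git" or "*.pdf" above if you don't want history or build output synced.

        cmd = ["rsync", "-az", "--delete"]         # --delete makes DEST mirror SOURCE
        for pattern in EXCLUDES:
            cmd += ["--exclude", pattern]
        cmd += [SOURCE, DEST]

        while True:
            try:
                subprocess.check_call(cmd)
            except subprocess.CalledProcessError as err:
                print("rsync failed:", err)        # keep looping; retry next round
            time.sleep(INTERVAL)

    rsync here is strictly one-way; for true two-way sync between the two machines, something like Unison, or simply keeping the documents in a remote git or Mercurial repository and syncing at the end of each working session, is the more robust route.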

    Read the article

  • Is my gedit-latex-plugin working properly?

    - by arroy_0209
    I have installed gedit-latex-plugin(0.2 rc3) to be used with gedit(2.30.3) in ubuntu 10.04. If I use the command gedit file.tex& in terminal the file is opened and it seems everything works fine but in the terminal, lots of comments appear, some of which are: 2012-03-31 22:14:27,263 DEBUG resources - Initializing resource locating 2012-03-31 22:14:27,361 DEBUG Preferences - not found 2012-03-31 22:14:27,373 DEBUG JobManager - Created JobManager instance 147209196 2012-03-31 22:14:27,379 DEBUG GeditLaTeXPlugin - activate 2012-03-31 22:14:27,379 DEBUG WindowContext - init 2012-03-31 22:14:27,444 DEBUG GeditWindowDecorator - _init_tab_decorators: initialized 0 decorators 2012-03-31 22:14:27,511 DEBUG GeditWindowDecorator - active_tab_changed 2012-03-31 22:14:27,511 DEBUG GeditWindowDecorator - ---------- ADJUST: None 2012-03-31 22:14:27,513 DEBUG GeditWindowDecorator - No window-scope views for this extension 2012-03-31 22:14:27,513 DEBUG GeditWindowDecorator - _set_selected_bottom_view: 0 2012-03-31 22:14:27,514 DEBUG GeditWindowDecorator - _set_selected_side_view: 0 2012-03-31 22:14:27,539 DEBUG GeditWindowDecorator - tab_added 2012-03-31 22:14:27,952 DEBUG GeditTabDecorator - loaded 2012-03-31 22:14:27,964 DEBUG GeditTabDecorator - _adjust_editor: URI has changed 2012-03-31 22:14:27,965 DEBUG LaTeXCompletionHandler - init 2012-03-31 22:14:27,966 DEBUG LanguageModelFactory - Pickled object found: /home/abcd/.gnome2/gedit/plugins/GeditLaTeXPlugin/latex.pkl 2012-03-31 22:14:28,075 DEBUG CompletionDistributor - init 2012-03-31 22:14:28,078 DEBUG WindowContext - Created view LaTeXOutlineView 2012-03-31 22:14:28,078 DEBUG WindowContext - Created view IssueView 2012-03-31 22:14:28,079 DEBUG LaTeXEditor - init(file:///home/abcd/dir1/file1.tex) 2012-03-31 22:14:28,079 DEBUG LaTeXEditor - Parsing document... 2012-03-31 22:14:28,080 DEBUG IssueView - init 2012-03-31 22:14:28,082 DEBUG IssueView - init finished 2012-03-31 22:14:28,092 INFO LaTeXEditor - LaTeXParser.parse: 0.010000 2012-03-31 22:14:28,092 DEBUG LaTeXEditor - Parsed 1599 bytes of content 2012-03-31 22:14:28,093 DEBUG LaTeXOutlineView - set_outline 2012-03-31 22:14:28,093 DEBUG LaTeXOutlineView - init 2012-03-31 22:14:28,097 DEBUG LaTeXValidator - validate 2012-03-31 22:14:28,098 DEBUG LanguageModel - set_newcommands: 2012-03-31 22:14:28,102 DEBUG LaTeXEditor - Parsing finished 2012-03-31 22:14:28,105 DEBUG GeditWindowDecorator - ---------- ADJUST: .tex 2012-03-31 22:14:28,119 DEBUG GeditWindowDecorator - _set_selected_bottom_view: 0 2012-03-31 22:14:28,120 DEBUG GeditWindowDecorator - _set_selected_side_view: 0 I am not sure if the gedit-latex-plugin is working properly or is it facing some problem. Why are there so many debug messages? Can anybody please suggest what I should do?

    Read the article

  • Is this the most effective and simple way to display a moving image? (SDL2)

    - by user36324
    I've looked around for tutorials on SDL2, but there aren't many, so I'm curious: I was messing around, and is this an effective way to move an image? One problem is that the image drags a trail along behind it as it moves. #include "SDL.h" #include "SDL_image.h" int main(int argc, char* argv[]) { bool exit = false; SDL_Init(SDL_INIT_EVERYTHING); SDL_Window *win = SDL_CreateWindow("Hello World!", 100, 100, 640, 480, SDL_WINDOW_SHOWN); SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC); SDL_Surface *png = IMG_Load("character.png"); SDL_Rect src; src.x = 0; src.y = 0; src.w = 161; src.h = 159; SDL_Rect dest; dest.x = 50; dest.y = 50; dest.w = 161; dest.h = 159; SDL_Texture *tex = SDL_CreateTextureFromSurface(ren, png); SDL_FreeSurface(png); while(exit==false){ dest.x++; SDL_RenderClear(ren); SDL_RenderCopy(ren, tex, &src, &dest); SDL_RenderPresent(ren); } SDL_Delay(5000); SDL_DestroyTexture(tex); SDL_DestroyRenderer(ren); SDL_DestroyWindow(win); SDL_Quit(); }

    Read the article

  • Achieve anisotropic filtering

    - by fedab
    I want to set up anisotropic filtering for my scene. I use SharpDX (DirectX 11) and C#. How do I set up anisotropic filtering in my shader? Currently I try this in the shader: Texture2D tex; sampler textureSampler = sampler_state { Texture = (tex); MipFilter = Anisotropic; MagFilter = Anisotropic; MinFilter = Anisotropic; MaxAnisotropy = 16; }; float4 PShader(float4 position : SV_POSITION, float4 color:COLOR, float2 tex0 : TEXCOORD0) : SV_TARGET { float4 textureColor; textureColor = tex.Sample(textureSampler, tex0) * color; return textureColor; } I get my object, textured, but it is not filtered anisotropically. I can write anything in the parameters, even invalid values, and I don't get any errors. The result is the same: objects without anisotropic filtering applied. Do I have to set this in the shader? Can I also do it with a SamplerState? I tested that, but I didn't get a result either. Some steps explaining what I have to set would be helpful.

    Read the article

  • Cannot get document.getElementById() to find my textarea

    - by Slruh
    Maybe I've been working on my site for too long, but I can't get the following to work. My textarea fires an onkeyup event handler called limiter, which is supposed to check the textarea and limit the text in the box, while updating another read-only input field that shows the number of characters left. This is the JavaScript code: <script type="text/javascript"> var count = "500"; function limiter(){ var comment = document.getElementById("comment"); var form = this.parent; var tex = comment.value; var len = tex.length; if(len > count){ tex = tex.substring(0,count); comment.value =tex; return false; } form.limit.value = count-len; } </script> The form looks like this: <form id="add-course-rating" method="post" action="/course_ratings/add/8/3/5/3" accept-charset="utf-8"><div style="display:none;"><input type="hidden" name="_method" value="POST" /> //Other inputs here <div id="comment-name" style="margin-top:10px"> <div id="comment-name-text"> <b>Comments</b><br /> Please leave any comments that you think will help anyone else. </div> <style type="text/css"> .rating-form-box textarea { -moz-border-radius:5px 5px 5px 5px; } </style> <div class="rating-form-box"> <textarea name="data[CourseRatings][comment]" id="comment" onkeyup="limiter()" cols="115" rows="5" ></textarea> <script type="text/javascript"> document.write("<input type=text name=limit size=4 readonly value="+count+">"); </script> </div> <input type="submit" value="Add Rating" style="float: right;"> </form> If anyone can help, that would be great.

    Read the article

  • GLSL subroutine not being used

    - by amoffat
    I'm using a gaussian blur fragment shader. In it, I thought it would be concise to include 2 subroutines: one for selecting the horizontal texture coordinate offsets, and another for the vertical texture coordinate offsets. This way, I just have one gaussian blur shader to manage. Here is the code for my shader. The {{NAME}} bits are template placeholders that I substitute in at shader compile time: #version 420 subroutine vec2 sample_coord_type(int i); subroutine uniform sample_coord_type sample_coord; in vec2 texcoord; out vec3 color; uniform sampler2D tex; uniform int texture_size; const float offsets[{{NUM_SAMPLES}}] = float[]({{SAMPLE_OFFSETS}}); const float weights[{{NUM_SAMPLES}}] = float[]({{SAMPLE_WEIGHTS}}); subroutine(sample_coord_type) vec2 vertical_coord(int i) { return vec2(0.0, offsets[i] / texture_size); } subroutine(sample_coord_type) vec2 horizontal_coord(int i) { //return vec2(offsets[i] / texture_size, 0.0); return vec2(0.0, 0.0); // just for testing if this subroutine gets used } void main(void) { color = vec3(0.0); for (int i=0; i<{{NUM_SAMPLES}}; i++) { color += texture(tex, texcoord + sample_coord(i)).rgb * weights[i]; color += texture(tex, texcoord - sample_coord(i)).rgb * weights[i]; } } Here is my code for selecting the subroutine: blur_program->start(); blur_program->set_subroutine("sample_coord", "vertical_coord", GL_FRAGMENT_SHADER); blur_program->set_int("texture_size", width); blur_program->set_texture("tex", *deferred_output); blur_program->draw(); // draws a quad for the fragment shader to run on and: void ShaderProgram::set_subroutine(constr name, constr routine, GLenum target) { GLuint routine_index = glGetSubroutineIndex(id, target, routine.c_str()); GLuint uniform_index = glGetSubroutineUniformLocation(id, target, name.c_str()); glUniformSubroutinesuiv(target, 1, &routine_index); // debugging int num_subs; glGetActiveSubroutineUniformiv(id, target, uniform_index, GL_NUM_COMPATIBLE_SUBROUTINES, &num_subs); std::cout << uniform_index << " " << routine_index << " " << num_subs << "\n"; } I've checked for errors, and there are none. When I pass in vertical_coord as the routine to use, my scene is blurred vertically, as it should be. The routine_index variable is also 1 (which is weird, because vertical_coord subroutine is the first listed in the shader code...but no matter, maybe the compiler is switching things around) However, when I pass in horizontal_coord, my scene is STILL blurred vertically, even though the value of routine_index is 0, suggesting that a different subroutine is being used. Yet the horizontal_coord subroutine explicitly does not blur. What's more is, whichever subroutine comes first in the shader, is the subroutine that the shader uses permanently. Right now, vertical_coord comes first, so the shader blurs vertically always. If I put horizontal_coord first, the scene is unblurred, as expected, but then I cannot select the vertical_coord subroutine! :) Also, the value of num_subs is 2, suggesting that there are 2 subroutines compatible with my sample_coord subroutine uniform. Just to re-iterate, all of my return values are fine, and there are no glGetError() errors happening. Any ideas?

    Read the article

  • How to include multiple tables programmatically in a Sweave document using R

    - by PaulHurleyuk
    Hello, I want to have a sweave document that will include a variable number of tables in. I thought the example below would work, but it doesn't. I want to loop over the list foo and print each element as it's own table. % \documentclass[a4paper]{article} \usepackage[OT1]{fontenc} \usepackage{longtable} \usepackage{geometry} \usepackage{Sweave} \geometry{left=1.25in, right=1.25in, top=1in, bottom=1in} \listfiles \begin{document} <<label=start, echo=FALSE, include=FALSE>>= startt<-proc.time()[3] library(RODBC) library(psych) library(xtable) library(plyr) library(ggplot2) options(width=80) #Produce some example data, here I'm creating some dummy dataframes and putting them in a list foo<-list() foo[[1]]<-data.frame(GRP=c(rep("AA",10), rep("Aa",10), rep("aa",10)), X1=rnorm(30), X2=rnorm(30,5,2)) foo[[2]]<-data.frame(GRP=c(rep("BB",10), rep("bB",10), rep("BB",10)), X1=rnorm(30), X2=rnorm(30,5,2)) foo[[3]]<-data.frame(GRP=c(rep("CC",12), rep("cc",18)), X1=rnorm(30), X2=rnorm(30,5,2)) foo[[4]]<-data.frame(GRP=c(rep("DD",10), rep("Dd",10), rep("dd",10)), X1=rnorm(30), X2=rnorm(30,5,2)) @ \title{Docuemnt to test putting a variable number of tables into a sweave Document} \author{"Paul Hurley"} \maketitle \section{Text} This document was created on \today, with \Sexpr{print(version$version.string)} running on a \Sexpr{print(version$platform)} platform. It took approx \input{time} sec to process. <<label=test, echo=FALSE, results=tex>>= cat("Foo") @ that was a test, so is this <<label=table1test, echo=FALSE, results=tex>>= print(xtable(foo[[1]])) @ \newpage \subsection{Tables} <<label=Tables, echo=FALSE, results=tex>>= for(i in seq(foo)){ cat("\n") cat(paste("Table_",i,sep="")) cat("\n") print(xtable(foo[[i]])) cat("\n") } #cat("<<label=endofTables>>= ") @ <<label=bye, include=FALSE, echo=FALSE>>= endt<-proc.time()[3] elapsedtime<-as.numeric(endt-startt) @ <<label=elapsed, include=FALSE, echo=FALSE>>= fileConn<-file("time.tex", "wt") writeLines(as.character(elapsedtime), fileConn) close(fileConn) @ \end{document} Here, the table1test chunk works as expected, and produced a table based on the dataframe in foo[[1]], however the loop only produces Table(underscore)1.... Any ideas what I'm doing wrong ?

    Read the article

  • Matlab set defaultTextInterpreter to LaTeX

    - by Maurits
    I am running Matlab R2010A on OS X 10.7.5. I have a simple Matlab plot and would like to use LaTeX commands in the axis labels and legend. However, setting set(0, 'defaultTextInterpreter', 'latex'); has zero effect and results in a TeX warning that my TeX commands cannot be parsed. If I open the plot tools for this plot, the default interpreter is set to 'TeX'. Manually setting this to 'LaTeX' obviously fixes it, but I can't do this for hundreds of plots. Now, if I retrieve the default interpreter via the Matlab prompt, i.e. get(0,'DefaultTextInterpreter') it says 'LaTeX', but again, when I look in the properties of the figure via the plot tools menu, the interpreter remains set to 'TeX'. Complete plotting code: figure f = 'somefile.eps' set(0, 'defaultTextInterpreter', 'latex'); ms = 8; fontSize = 18; loglog(p_m_sip, p_fa_sip, 'ko-.', 'LineWidth', 2, 'MarkerSize', ms); hold on; xlabel('$P_{fa}$', 'fontsize', fontSize); ylabel('$P_{m}$', 'fontsize', fontSize); legend('$\textbf{K}_{zz}$', 'Location', 'Best'); set(gca, 'XMinorTick', 'on', 'YMinorTick', 'on', 'YGrid', 'on', 'XGrid', 'on'); print('-depsc2', f);

    Read the article

  • Mercurial hg Subrepository Problem - "abort: unknown revision"

    - by Tex
    Note: I asked this yesterday over at kiln.stackexchange.com, but haven't gotten an answer, and it's holding up my work. So I figured I'd give it a shot here. My main Mercurial repository has a bunch of subrepositories in it. During initial setup, I made a mistake in my .hgsub. Namely, I pointed two subrepositories to the same directory. What I should have had: sites/1=sites/1 sites/2=sites/2 sites/3=sites/3 What I actually had: sites/1=sites/1 sites/2=sites/2 sites/2=sites/3 Stupid copy/paste error. I committed the incorrect .hgsub, not realizing my error. A few revisions later, while adding some new subrepositories to .hgsub, I noticed the mistake and fixed it inside .hgsub. I committed and kept rolling along. I've committed a reasonable amount of work that I'd prefer not to redo since I 'fixed' the mistake in .hgsub. Now we come to the actual problem: I've made some changes inside the subrepository sites/3, and when I try to commit the main repository, I get the following error: abort: unknown revision 'LongGUIDLookingString' I found this discussion, which seems to address the same problem I'm having, but I can't quite work out how bos fixed it. What do I need to do in order to fix this?

    Read the article

  • ASP.NET 2.0 app runs on Win 2003 in IIS 5 isolation mode but not in (default) IIS 6 mode

    - by Tex
    The app uses DLLImport to call a legacy unmanaged dll. Let's call this dll Unmanaged.dll for the sake of this question. Unmanaged.dll has dependencies on 5 other legacy dll's. All of the legacy dll's are placed in the WebApp/bin/ directory of my ASP.NET application. When IIS is running in 5.0 isolation mode, the app works fine - calls to the legacy dll are processed without error. When IIS is running in the default 6.0 mode, the app is able to initiate the Unmanaged.dll (InitMe()), but dies during a later call to it (ProcessString()). I'm pulling my hair out here. I've moved the unmanaged dll's to various locations, tried all kinds of security settings and searched long and hard for a solution. Help! Sample code: [DllImport("Unmanaged.dll", EntryPoint="initME", CharSet=System.Runtime.InteropServices.CharSet.Ansi, CallingConvention=CallingConvention.Cdecl)] internal static extern int InitME(); //Calls to InitMe work fine - Unmanaged.dll initiates and writes some entries in a dedicated log file [DllImport("Unmanaged.dll", EntryPoint="processString", CharSet=System.Runtime.InteropServices.CharSet.Ansi, CallingConvention=CallingConvention.Cdecl)] internal static extern int ProcessString(string inStream, int inLen, StringBuilder outStream, ref int outLen, int maxLen); //Calls to ProcessString cause the app to crash, without leaving much of a trace that I can find so far

    Read the article

  • Using hg repository as web site

    - by Tex
    This is somewhat related to my security question here. Is it a bad idea to use an hg / mercurial repository for a live website? If so, why? Furthermore, we have dev, test and production installations of our website, like dev.example.com, test.example.com and www.example.com. If it's a bad idea to use a repository for a live/production website, would it be OK to use an hg repository for the dev and test sites? I'm also concerned about ease of deployment. We have technical and less technical co-workers who will be working with the site. The technical guys (software engineers) won't have any problem working with the command line or TortoiseHG. I'm more concerned about the less technical guys (web designers). They won't be comfortable working on the command line, and may even find TortoiseHG daunting. These guys mostly upload .css files and images to the server. I'd like for these files (at least the .css files) to be under version control, but I want this to be as transparent as possible for the non technical guys. What's the best way to achieve this? Edit: Our 'site' is actually a multi-site CMS setup with a main repository and several subrepositories. Mock-up of the repository structure: /root [main repository containing core files and subrepositories] /modules [modules subrepository] /sites/global [subrepository for global .css and .php files] /sites/site1 [site1 subrepository] ... /sites/siteN [siteN subrepository] Software engineers would work in the root, modules and sites/global repositories. Less technical guys (web designers) would work only in the site1 ... siteN subrepositories.

    Read the article

  • .NET client getting "not well formed" XML response from Axis web service

    - by Tex
    I have a simple .NET app that makes a SOAP call to a third party Axis web service. When I trace the HTTP traffic, I see that the Request looks fine, however I'm getting an exception: "Response is not well-formed XML." The return object is null, as it seems the XML can't be deserialized. One question regarding the various namespace declarations inside the wsdl. Several of these declarations point to URLs / domains that no longer exist. Could this cause any problems? From the wsdl document: <wsdl:definitions targetNamespace="http://domaindoesntexist.com/" xmlns:apachesoap="http://xml.apache.org/xml-soap" xmlns:impl="http://domaindoesntexist.com/" xmlns:intf="http://domaindoesntexist.com/" xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:wsdlsoap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> A sample HTTP response with incriminating data removed: HTTP/1.1 200 OK Server: Apache-Coyote/1.1 Content-Type: text/xml;charset=utf-8 Transfer-Encoding: chunked Date: Fri, 05 Jun 2009 13:54:59 GMT 7cb <?xml version="1.0" encoding="utf-8"?> <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body> <someMethod xmlns="http://test.com/services/myservice/"> </someMethod> </soapenv:Body> </soapenv:Envelope> 0
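
    One quick way to tell "the service returns bad XML" apart from "the .NET client mishandles the chunked response" is to post the same request outside .NET and feed the raw body to an XML parser. A minimal Python sketch; the endpoint URL, SOAPAction and request body below are placeholders patterned on the sample, not the real service's values:

        # Post the same SOAP request outside .NET and check the response parses.
        # URL, SOAPAction and body are placeholders, not the real service's values.
        import xml.dom.minidom as minidom
        import requests

        url = "http://example.com/services/myservice"    # placeholder endpoint
        body = """<?xml version="1.0" encoding="utf-8"?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Body>
            <someMethod xmlns="http://test.com/services/myservice/"/>
          </soapenv:Body>
        </soapenv:Envelope>"""

        resp = requests.post(url, data=body.encode("utf-8"), headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": '""',                          # placeholder action
        })
        raw = resp.content      # requests reassembles chunked responses

        try:
            minidom.parseString(raw)
            print("Response is well-formed XML (%d bytes)" % len(raw))
        except Exception as err:
            print("Response is NOT well-formed:", err)
            print(repr(raw[:200]))                       # peek at what actually came back

    If the body parses cleanly here, the wire format is fine and the problem is on the client side. The dead namespace URLs, incidentally, should be harmless by themselves: XML namespace URIs are only identifiers and are never fetched.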

    Read the article

  • Vim: change the quickfix title

    - by romeovs
    I'm using the following makeprg to compile my TeX files in Vim: setlocal makeprg=pdflatex\ \-file\-line\-error\ \-shell\-escape\ \-interaction=nonstopmode\ $*\\\|\ tee\ \/dev\/tty\ \\\|\ grep\ \-P\ ':\\d{1,5}:\ ' which yields great results (errors displayed properly, TeX compilation shown while it runs, ...). Yet, when there are errors and the quickfix window pops up, its status bar is cluttered up with the makeprg string: pdflatex\ \-file\-line\-error\ \-shell\-escape\ \-interaction=nonstopmode\ $*\\\|\ tee\ \/dev\/tty\ \\\|\ grep\ \-P\ ':\\d{1,5}:\ ' Is there a way of changing the quickfix title/status bar?

    Read the article

  • How do I change the quickfix title (status bar) in Vim?

    - by romeovs
    I have the following makeprg to compile my TeX files in Vim: setlocal makeprg=pdflatex\ \-file\-line\-error\ \-shell\-escape\ \-interaction=nonstopmode\ $*\\\|\ tee\ \/dev\/tty\ \\\|\ grep\ \-P\ ':\\d{1,5}:\ ' which gives me good results (errors displayed properly, TeX compilation shown while it runs, ...). Yet there is one thing I'm not pleased about: when there are errors and the quickfix window pops up, its status bar is cluttered up with the makeprg string: pdflatex\ \-file\-line\-error\ \-shell\-escape\ \-interaction=nonstopmode\ $*\\\|\ tee\ \/dev\/tty\ \\\|\ grep\ \-P\ ':\\d{1,5}:\ ' Is there a way of changing the quickfix title/status bar?

    Read the article
