Search Results

Search found 3857 results on 155 pages for 'adobe after effects'.


  • Least CPU-intensive way of streaming your screen on Windows?

    - by sinni800
    Hello, sometimes I like capturing my screen for others to see. Only thing: I am playing games while I do it. I have tried a few streaming solutions; Windows Media Encoder coupled with my own Windows server appealed to me most, because I can change resolutions, etc. I also tried ustream coupled with the Flash applet and the Adobe Flash Encoder recording a Camtasia source. Camtasia has the disadvantage, though, that it shows the alternating green-and-black borders and cannot capture full screen. I like how Xfire does it, but it doesn't work with every game; many are simply not supported. A few thoughts about this: Is there a program which captures like Fraps or Xfire (based on Direct3D and OpenGL output) and exposes the output to a DirectShow source filter? Which brings me to: is there hardware-accelerated capturing directly from the graphics card, maybe including direct encoding with help from OpenCL? Modern graphics cards decode Blu-ray content directly, for example. I should have a modern enough graphics processor for this to be possible (see below). If using Windows Media Encoder: which are the least CPU-intensive settings? Which codec? Is there a newer codec than Windows Media 9, and is it less CPU-intensive? I only have 7, 8 and 9 inside the Encoder. Could the performance be massively increased by having a quad-core CPU (see below)? Bandwidth is no problem up to 1000 to 1500 kbit/s (I have 2048). My computer specs: Intel Core 2 Duo E8400, 4 GB DDR2-800 RAM, ATI Radeon HD 5770, running Windows 7 Professional.

    Read the article

  • Why is Mac supposedly better than Windows for graphics?

    - by Svish
    OK, people just keep telling me that if you're going to be working with graphics and design and stuff, you should get a Mac. And I just don't get the logic, because most of these people would be working with Adobe software, which is available for both Windows and Mac. To me it seems like their whole argument is based on "everyone else does". Like, Mac had some graphics software that Windows didn't earlier in history, so most people were using Mac. And since most people were using Mac, new people also started using Mac. And since most people were using Mac, schools and universities used Mac. Which taught new people to use Mac. So they were using Mac. And told everyone they met that everyone they knew was using Mac. And so on. Anyways... What is the deal really? Is there actually any advantage in using a Mac for graphics and design and such things? My take is that you pretty much have the same software, and both Mac and Windows are powerful enough, support enough RAM, are stable (as long as you don't install lots of junk or faulty drivers), et cetera. So, can anyone give me a good explanation on this? Is there a real difference or are people just brainwashed?

    Read the article

  • Why are FMS logs filled with 'play' event status code 408 for a failed webcast?

    - by Stu Thompson
    Recently we had a live webcast event go horribly wrong. I'm doing the technical post-mortem, with limited information. We know that the hardware encoder (a Digital Rapid Touch Stream Web HDI) was unable to send upstream at a sustained, reliably high rate. What we don't know is whether the encoder's connection was problematic (Zürich), or that of the streaming server (in Frankfurt). Unfortunately, I've got three different vendors all blaming each other (the CDN that runs the server, the on-site ISP and the on-site encoding team). In the FMS log files I see a couple of interesting things: zillions of status code 408 entries on play events for clients. Adobe's documentation states that this means "Stream stopped because client disconnected". ("Zillions" would be a ratio of about 10 events for every individual IP address.) I also see several unpublish / (re)publish events per hour for the encoder. I'd like to know if all those 408s could tell me with authority that the FMS server was starved for bandwidth, or that the encoding signal was starved (and hence the server was disconnecting clients). Any clues?
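    A quick way to sanity-check that 10:1 ratio is to tally 408 play events per client IP straight from the access log. A minimal C# sketch (the tab-delimited layout and column positions are assumptions - check the #Fields header of your FMS access log and adjust the indexes):

    ```csharp
    using System;
    using System.IO;
    using System.Linq;

    class Count408s
    {
        // Column positions are assumptions; verify them against the log's #Fields header.
        const int ClientIpColumn = 1;
        const int EventColumn = 4;
        const int StatusColumn = 5;

        static void Main(string[] args)
        {
            var perClient = File.ReadLines(args[0])
                .Where(line => !line.StartsWith("#"))              // skip header/comment lines
                .Select(line => line.Split('\t'))
                .Where(f => f.Length > StatusColumn
                         && f[EventColumn] == "play"
                         && f[StatusColumn] == "408")
                .GroupBy(f => f[ClientIpColumn])                   // one bucket per client IP
                .OrderByDescending(g => g.Count());

            foreach (var g in perClient)
                Console.WriteLine("{0}\t{1}", g.Key, g.Count());
        }
    }
    ```

    If the counts cluster tightly around 10 per address, that would be consistent with every viewer repeatedly reconnecting and being dropped, rather than a handful of misbehaving clients - though on its own it still doesn't prove which side of the link was starved.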

    Read the article

  • Bad results converting PDF to EPS on Linux

    - by Tim
    I'm having some trouble converting PDFs (created by Adobe Illustrator on a Mac) to EPS. I have tried several things, but I am wondering if there is a better option. The following list is ordered by decreasing quality: inkscape --export-area-page --export-eps=out.eps in.pdf using the graphical program Inkscape works best, but is a bit slow; pdftops -eps in.pdf out.eps uses Poppler, works well and is fast; pdf2ps in.pdf out.eps uses Ghostscript and works OK for simple documents; convert in.pdf out.eps uses ImageMagick and always rasterizes the image. I haven't tested the following: acroread -toPostScript using acroread (Linux only). Some issues I've found: Transparency is not supported in EPS, but instead of flattening the layers, most programs rasterize the image, producing big files and ugly graphs. Inkscape does this best by only rasterizing the unsupported area. Gradients are rendered properly by Inkscape, but Poppler somehow chops up the gradient into many shapes of different colors. Greek symbols are seemingly not supported by Ghostscript and are rasterized (using pdf2ps). What are your experiences with this kind of task? Did I forget certain programs and/or command-line options that improve quality? I found some posts on this, but not a (thorough) comparison of the possibilities; please correct me if I'm wrong. Related post: How to convert PDF to EPS? on TeX.
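    For batch conversions it can help to wrap the converters so the higher-quality ones are tried first and the rasterizing ones only as a last resort. A rough C# sketch that shells out to the same tools listed above (tool names and flags are taken from the post; the fallback order is just one reasonable choice):

    ```csharp
    using System;
    using System.Diagnostics;

    class PdfToEps
    {
        // Candidate converters, best quality first (order follows the post's ranking).
        static readonly string[][] Converters =
        {
            new[] { "inkscape", "--export-area-page --export-eps={1} {0}" },
            new[] { "pdftops",  "-eps {0} {1}" },
            new[] { "pdf2ps",   "{0} {1}" },
        };

        static void Main(string[] args)
        {
            string input = args[0], output = args[1];
            foreach (var tool in Converters)
            {
                var psi = new ProcessStartInfo(tool[0], string.Format(tool[1], input, output))
                {
                    UseShellExecute = false
                };
                try
                {
                    using (var p = Process.Start(psi))
                    {
                        p.WaitForExit();
                        if (p.ExitCode == 0) return;     // stop at the first converter that succeeds
                    }
                }
                catch (Exception e)
                {
                    Console.Error.WriteLine("{0} failed to start: {1}", tool[0], e.Message);
                }
            }
            Console.Error.WriteLine("All converters failed for " + input);
        }
    }
    ```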

    Read the article

  • Relax Linux - it's just me! (filesystem permissions)

    - by Xeoncross
    One of my favorite things about Linux is also the most annoying - file system permissions. On production machines and web servers I love how everything is so secure and locked down - but on development machines it really slows me down. I'll give one example out of the many that I discover weekly. Like most people, I dual-boot Ubuntu and Windows so I can continue using the Adobe CS4 suite. I often design web themes and other things while I'm still using Windows. Later I'll boot into Ubuntu to take the themes and write the backend PHP for them. After mounting the Windows C: drive partition I can copy the template files over so I can begin editing them. However, thanks to Linux's desire to protect me, I find that after copying the files I end up with a totally locked set of files where even I don't have read-write permissions. So after careful consideration of the tremendous risks that the HTML files pose to me, I chmod them so that I and Apache can begin using them. Now granted, the chmod process isn't that hard - but after you chmod enough files per day you get sick of doing it. I'm constantly creating, fetching, editing, and removing files from my user, git repos, PHP, or other random processes. This is a personal development machine after all. Everything changes on a day-to-day basis. So my question is, how can I get Linux to relax about what I'm doing with my HTML/JS/PHP/TXT/SQL/etc. files so that I can work faster without constantly stopping to chmod things? I pinky-promise I won't hack into my account with an HTML file. ;)
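    Until the mount options are sorted out, one stopgap is to normalize ownership and permissions in a single pass right after copying, instead of chmod-ing file by file. A small sketch that just shells out to the usual tools (the 755/644 split, the user/group names and the path are placeholders for whatever your web root actually needs):

    ```csharp
    using System.Diagnostics;

    class RelaxPermissions
    {
        static void Run(string cmd, string args)
        {
            using (var p = Process.Start(new ProcessStartInfo(cmd, args) { UseShellExecute = false }))
                p.WaitForExit();
        }

        static void Main(string[] args)
        {
            string root = args.Length > 0 ? args[0] : "/var/www/mysite";   // hypothetical path

            Run("chown", "-R me:www-data " + root);                 // 'me' and 'www-data' are placeholders
            Run("find", root + " -type d -exec chmod 755 {} +");    // directories need the execute bit
            Run("find", root + " -type f -exec chmod 644 {} +");    // plain files do not
        }
    }
    ```

    The longer-term fix is usually to mount the NTFS partition with uid/gid/umask options so the copied files arrive with sane ownership in the first place, which removes the need for the script entirely.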

    Read the article

  • Shell command slow when using pipe, fast with intermediate file

    - by plang
    Does anyone understand this huge difference in processing time, when using an intermediate file, or when using a pipe? I'm converting tiff to pdf using standard tools on a fresh debian squeeze server. A standard way of doing this is to convert to ps first. Without pipe: root@web5:~# time tiff2ps test.tif > test.ps real 0m0.860s user 0m0.744s sys 0m0.112s root@web5:~# time ps2pdf13 -sPAPERSIZE=a4 test.ps > test.pdf real 0m0.667s user 0m0.612s sys 0m0.060s With pipe: root@web5:~# time tiff2ps test.tif | ps2pdf13 -sPAPERSIZE=a4 - > test.pdf real 1m6.098s user 0m15.861s sys 0m50.9 During the last command, gs process is at 100% all the time. Update: Here is an strace output for the ps generation: root@web5:~# strace tiff2ps test.tif > test.ps execve("/usr/bin/tiff2ps", ["tiff2ps", "test.tif"], [/* 28 vars */]) = 0 brk(0) = 0x1395000 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) mmap(NULL, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a1937000 access("/etc/ld.so.preload", R_OK) = -1 ENOENT (No such file or directory) open("/etc/ld.so.cache", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=21735, ...}) = 0 mmap(NULL, 21735, PROT_READ, MAP_PRIVATE, 3, 0) = 0x7fb5a1931000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libtiff.so.4", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0P\200\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=405128, ...}) = 0 mmap(NULL, 2501416, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7fb5a14b9000 mprotect(0x7fb5a151a000, 2093056, PROT_NONE) = 0 mmap(0x7fb5a1719000, 12288, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x60000) = 0x7fb5a1719000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libjpeg.so.62", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\3408\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=145048, ...}) = 0 mmap(NULL, 2240080, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7fb5a1296000 mprotect(0x7fb5a12b9000, 2093056, PROT_NONE) = 0 mmap(0x7fb5a14b8000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x22000) = 0x7fb5a14b8000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libz.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\260\"\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=93936, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a1930000 mmap(NULL, 2188976, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7fb5a107f000 mprotect(0x7fb5a1096000, 2093056, PROT_NONE) = 0 mmap(0x7fb5a1295000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x16000) = 0x7fb5a1295000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/lib/libm.so.6", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\360>\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=530736, ...}) = 0 mmap(NULL, 2625768, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7fb5a0dfd000 mprotect(0x7fb5a0e7d000, 2097152, PROT_NONE) = 0 mmap(0x7fb5a107d000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x80000) = 0x7fb5a107d000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) 
open("/lib/libc.so.6", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\240\355\1\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=1437064, ...}) = 0 mmap(NULL, 3545160, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7fb5a0a9b000 mprotect(0x7fb5a0bf4000, 2093056, PROT_NONE) = 0 mmap(0x7fb5a0df3000, 20480, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x158000) = 0x7fb5a0df3000 mmap(0x7fb5a0df8000, 18504, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x7fb5a0df8000 close(3) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a192f000 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a192e000 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a192d000 arch_prctl(ARCH_SET_FS, 0x7fb5a192e700) = 0 mprotect(0x7fb5a0df3000, 16384, PROT_READ) = 0 mprotect(0x7fb5a107d000, 4096, PROT_READ) = 0 mprotect(0x7fb5a1939000, 4096, PROT_READ) = 0 munmap(0x7fb5a1931000, 21735) = 0 open("test.tif", O_RDONLY) = 3 brk(0) = 0x1395000 brk(0x13b6000) = 0x13b6000 read(3, "II*\0\10\0\0\0", 8) = 8 fstat(3, {st_mode=S_IFREG|0644, st_size=1825656, ...}) = 0 mmap(NULL, 1825656, PROT_READ, MAP_SHARED, 3, 0) = 0x7fb5a176f000 open("/proc/meminfo", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0444, st_size=0, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a1936000 read(4, "MemTotal: 2090844 kB\nMemF"..., 1024) = 1024 close(4) = 0 munmap(0x7fb5a1936000, 4096) = 0 write(2, "TIFFReadDirectory: ", 19TIFFReadDirectory: ) = 19 write(2, "Warning, ", 9Warning, ) = 9 write(2, "test.tif: wrong data type 7 for "..., 59test.tif: wrong data type 7 for "RichTIFFIPTC"; tag ignored) = 59 write(2, ".\n", 2. ) = 2 gettimeofday({1334836895, 374666}, NULL) = 0 fstat(1, {st_mode=S_IFREG|0664, st_size=0, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a1936000 open("/etc/localtime", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=1892, ...}) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=1892, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb5a1935000 read(4, "TZif2\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\4\0\0\0\4\0\0\0\0"..., 4096) = 1892 lseek(4, -1217, SEEK_CUR) = 675 read(4, "TZif2\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\6\0\0\0\6\0\0\0\0"..., 4096) = 1217 close(4) = 0 munmap(0x7fb5a1935000, 4096) = 0 write(1, "%!PS-Adobe-3.0 EPSF-3.0\n%%Creato"..., 4096) = 4096 write(1, "fffffffffffffffffffffffffffff\nff"..., 4096) = 4096 write(1, "ffffffffffffffffffff\nfffffffffff"..., 4096) = 4096 write(1, "fffffffffff\nffffffffffffffffffff"..., 4096) = 4096 write(1, "ff\nfffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffff\nfffffff"..., 4096) = 4096 Here is an strace output for the piped version: PS generation seems to be much slower when output is piped into ps2pdf13. 
root@web5:~# strace tiff2ps test.tif | ps2pdf13 -sPAPERSIZE=a4 - > test.pdf execve("/usr/bin/tiff2ps", ["tiff2ps", "test.tif"], [/* 28 vars */]) = 0 brk(0) = 0x1b97000 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) mmap(NULL, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208bb1000 access("/etc/ld.so.preload", R_OK) = -1 ENOENT (No such file or directory) open("/etc/ld.so.cache", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=21735, ...}) = 0 mmap(NULL, 21735, PROT_READ, MAP_PRIVATE, 3, 0) = 0x7f9208bab000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libtiff.so.4", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0P\200\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=405128, ...}) = 0 mmap(NULL, 2501416, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7f9208733000 mprotect(0x7f9208794000, 2093056, PROT_NONE) = 0 mmap(0x7f9208993000, 12288, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x60000) = 0x7f9208993000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libjpeg.so.62", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\3408\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=145048, ...}) = 0 mmap(NULL, 2240080, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7f9208510000 mprotect(0x7f9208533000, 2093056, PROT_NONE) = 0 mmap(0x7f9208732000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x22000) = 0x7f9208732000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/usr/lib/libz.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\260\"\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=93936, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208baa000 mmap(NULL, 2188976, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7f92082f9000 mprotect(0x7f9208310000, 2093056, PROT_NONE) = 0 mmap(0x7f920850f000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x16000) = 0x7f920850f000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/lib/libm.so.6", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\360>\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0644, st_size=530736, ...}) = 0 mmap(NULL, 2625768, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7f9208077000 mprotect(0x7f92080f7000, 2097152, PROT_NONE) = 0 mmap(0x7f92082f7000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x80000) = 0x7f92082f7000 close(3) = 0 access("/etc/ld.so.nohwcap", F_OK) = -1 ENOENT (No such file or directory) open("/lib/libc.so.6", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\240\355\1\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=1437064, ...}) = 0 mmap(NULL, 3545160, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x7f9207d15000 mprotect(0x7f9207e6e000, 2093056, PROT_NONE) = 0 mmap(0x7f920806d000, 20480, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x158000) = 0x7f920806d000 mmap(0x7f9208072000, 18504, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x7f9208072000 close(3) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208ba9000 mmap(NULL, 4096, 
PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208ba8000 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208ba7000 arch_prctl(ARCH_SET_FS, 0x7f9208ba8700) = 0 mprotect(0x7f920806d000, 16384, PROT_READ) = 0 mprotect(0x7f92082f7000, 4096, PROT_READ) = 0 mprotect(0x7f9208bb3000, 4096, PROT_READ) = 0 munmap(0x7f9208bab000, 21735) = 0 open("test.tif", O_RDONLY) = 3 brk(0) = 0x1b97000 brk(0x1bb8000) = 0x1bb8000 read(3, "II*\0\10\0\0\0", 8) = 8 fstat(3, {st_mode=S_IFREG|0644, st_size=1825656, ...}) = 0 mmap(NULL, 1825656, PROT_READ, MAP_SHARED, 3, 0) = 0x7f92089e9000 open("/proc/meminfo", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0444, st_size=0, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208bb0000 read(4, "MemTotal: 2090844 kB\nMemF"..., 1024) = 1024 close(4) = 0 munmap(0x7f9208bb0000, 4096) = 0 write(2, "TIFFReadDirectory: ", 19TIFFReadDirectory: ) = 19 write(2, "Warning, ", 9Warning, ) = 9 write(2, "test.tif: wrong data type 7 for "..., 59test.tif: wrong data type 7 for "RichTIFFIPTC"; tag ignored) = 59 write(2, ".\n", 2. ) = 2 gettimeofday({1334836513, 114140}, NULL) = 0 fstat(1, {st_mode=S_IFIFO|0600, st_size=0, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208bb0000 open("/etc/localtime", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=1892, ...}) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=1892, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f9208baf000 read(4, "TZif2\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\4\0\0\0\4\0\0\0\0"..., 4096) = 1892 lseek(4, -1217, SEEK_CUR) = 675 read(4, "TZif2\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\6\0\0\0\6\0\0\0\0"..., 4096) = 1217 close(4) = 0 munmap(0x7f9208baf000, 4096) = 0 write(1, "%!PS-Adobe-3.0 EPSF-3.0\n%%Creato"..., 4096) = 4096 write(1, "fffffffffffffffffffffffffffff\nff"..., 4096) = 4096 write(1, "ffffffffffffffffffff\nfffffffffff"..., 4096) = 4096 write(1, "fffffffffff\nffffffffffffffffffff"..., 4096) = 4096 write(1, "ff\nfffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 write(1, "ffffffffffffffffffffffffffffffff"..., 4096) = 4096 ...etc...
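    Given those numbers, a pragmatic workaround is to script the fast two-step path yourself: write the PostScript to a temporary file and hand that to ps2pdf13 instead of piping. A C# sketch of such a wrapper (same tools and flags as above; error handling kept minimal):

    ```csharp
    using System;
    using System.Diagnostics;
    using System.IO;

    class Tiff2Pdf
    {
        static void Run(string cmd, string args)
        {
            var psi = new ProcessStartInfo(cmd, args) { UseShellExecute = false };
            using (var p = Process.Start(psi))
            {
                p.WaitForExit();
                if (p.ExitCode != 0)
                    throw new Exception(cmd + " exited with code " + p.ExitCode);
            }
        }

        static void Main(string[] args)
        {
            string tif = args[0], pdf = args[1];
            string ps = Path.GetTempFileName();      // intermediate .ps file instead of a pipe
            try
            {
                // tiff2ps writes PostScript to stdout, so wrap it in a shell redirection.
                Run("/bin/sh", "-c \"tiff2ps '" + tif + "' > '" + ps + "'\"");
                Run("ps2pdf13", "-sPAPERSIZE=a4 " + ps + " " + pdf);
            }
            finally
            {
                File.Delete(ps);
            }
        }
    }
    ```

    One plausible (unverified) explanation for the slowdown is that one of the tools takes a much slower code path when its output is a non-seekable pipe rather than a regular file; the fstat on stdout (S_IFREG in the first trace, S_IFIFO in the second) is the one meaningful difference visible above before the writes begin. Using an intermediate file sidesteps the question either way.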

    Read the article

  • Tool to modify properties/metadata of a PDF? i.e. Change "Title", "Author"? Sony Reader showing som

    - by Chris W. Rea
    I own a Sony Reader PRS-600 ebook reader. I bought a ton of Manning Publications ebooks (DRM-free) recently. Many of the books are PDFs since not all the ones I wanted are available in epub format. The problem: Some of the PDF books I purchased have incorrect or missing metadata. Making things worse, the Sony Reader only displays the "Title" from the PDF metadata when displaying book titles in the reader's collection of books! The Reader doesn't display the filename. So, even though I have a PDF informatively named "Windows PowerShell In Action.pdf", it shows up as "untitled" in the Reader. Imagine how useful the Reader's list of book titles becomes when many are just "untitled" or "unnamed document" ! Yes, it is maddening. So – short of expecting the publisher to fix the files or Sony to add a filename-based list instead, I'm looking for a way to fix the PDF metadata. I can view the metadata with Adobe Reader, but it doesn't permit modification of the properties. Leading to: Question: Is there a tool – free, or cheap – and either for PC or Mac, that can modify the properties / metadata of a DRM-free PDF document? I want to correct "Title" and "Author" fields, specifically.
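    If a point-and-click tool doesn't turn up, the fields can also be rewritten in a few lines of code. A sketch using the open-source iTextSharp library (my choice, not something from the question; exact property names can differ between library versions):

    ```csharp
    using System.Collections.Generic;
    using System.IO;
    using iTextSharp.text.pdf;

    class FixPdfMetadata
    {
        static void Main()
        {
            var reader = new PdfReader("Windows PowerShell In Action.pdf");
            using (var output = new FileStream("Windows PowerShell In Action (fixed).pdf", FileMode.Create))
            using (var stamper = new PdfStamper(reader, output))
            {
                // Start from the existing Info dictionary so other fields survive.
                var info = new Dictionary<string, string>(reader.Info);
                info["Title"]  = "Windows PowerShell in Action";
                info["Author"] = "Author Name";   // placeholder value
                stamper.MoreInfo = info;
            }
            reader.Close();
        }
    }
    ```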

    Read the article

  • Java 6 Certified with Forms and Reports 10g for EBS 12

    - by John Abraham
    Java 6 is now certified with Oracle Application Server 10g Forms and Reports with Oracle E-Business Suite Release 12 (12.0.6, 12.1.1 and higher). What? Wasn't this already certified? No, but a little background might be useful in understanding why this is a new announcement. We previously certified the use of Java 6 with E-Business Suite Release 12 -- with the sole exception of Oracle Application Server 10g components in the E-Business Suite technology stack. Oracle Application Server 10g originally included Java 1.4.2 as part of its distribution.  E-Business Suite 12 uses, amongst other things, the Oracle Forms and Reports 10g components running on Java 1.4. Java 1.4 in the Oracle Application Server 10g ORACLE_HOME is used exclusively by AS 10g Forms and Reports' for Java functionality.  This version of Java is separate from the Java distribution used by other parts of EBS such as Oracle Containers for Java (OC4J). What's new about this certification? You can now upgrade the older Java 1.4 libraries used by Oracle Forms & Reports 10g to Java 6. This allows you to upgrade the Java releases within the Oracle Application Server 10g ORACLE_HOME to the the same level as the rest of your E-Business Suite technology stack components. Why upgrade? This becomes particularly important for customers as individual vendors' support lifecycle for Java 1.4 reaches End of Life: Oracle's Sun JDK Release 1.4.2's End of Extended Support: February 2013 (Sustaining Support indefinitely after) IBM SDK and JRE 1.4.2's End of Service: September 2013 HP-UX Java 1.4.2's End-of-Life : May 2012 Along with Oracle Forms, Java lies at the heart of the Oracle E-Business Suite.  Small improvements in Java can have significant effects on the performance and stability of the E-Business Suite.  As a notable side-benefit, later versions of Java have improved built-in and third-party tools for JVM performance monitoring and tuning.Our standing recommendation is that you always stay current with the latest available Java update provided by your operating system vendor.  Don't forget to upgrade Forms & Reports to 10.1.2.3 E-Business Suite 12 originally shipped with Oracle Application Server 10g Forms & Reports 10.1.2.0.2.  That version is no longer eligible for Error Correction Support. New Forms and Reports 10g patches are now being released with Forms and Reports 10.1.2.3 as the prerequisite. Forms and Reports 10.1.2.3 was certified for EBS 12 environments in November 2008. If you haven't upgraded your EBS 12 environment to Forms & Reports 10.1.2.3, this is a good opportunity to do so. References Using Latest Update of Java 6.0 with Oracle E-Business Suite Release 12 (My Oracle Support Document 455492.1) Overview of Using Java with Oracle E-Business Suite Release 12 (My Oracle Support Document 418664.1) Oracle Lifetime Support Policy (Oracle Fusion Middleware) IBM Developer Kit Lifecycle Dates HP-UX Java - End of Life Policy & Release Naming Terminology Related Articles OracleAS 10g Forms and Reports 10.1.2.3 Certified With EBS R12 Java 6 Certified with E-Business Suite Release 12

    Read the article

  • XNA orbit camera troubles

    - by user17753
    I have a Model named cube to which I load in LoadContent(): cube = Content.Load<Model>("untitled");. In the Draw Method I call DrawModel: private void DrawModel(Model m, Matrix world) { foreach (ModelMesh mesh in m.Meshes) { foreach (BasicEffect effect in mesh.Effects) { effect.EnableDefaultLighting(); effect.View = camera.View; effect.Projection = camera.Projection; effect.World = world; } mesh.Draw(); } } camera is of the Camera type, a class I've setup. Right now it is instantiated in the initialization section with the graphics aspect ratio and the translation (world) vector of the model, and the Draw loop calls the camera.UpdateCamera(); before drawing the models. class Camera { #region Fields private Matrix view; // View Matrix for Camera private Matrix projection; // Projection Matrix for Camera private Vector3 position; // Position of Camera private Vector3 target; // Point camera is "aimed" at private float aspectRatio; //Aspect Ratio for projection private float speed; //Speed of camera private Vector3 camup = Vector3.Up; #endregion #region Accessors /// <summary> /// View Matrix of the Camera -- Read Only /// </summary> public Matrix View { get { return view; } } /// <summary> /// Projection Matrix of the Camera -- Read Only /// </summary> public Matrix Projection { get { return projection; } } #endregion /// <summary> /// Creates a new Camera. /// </summary> /// <param name="AspectRatio">Aspect Ratio to use for the projection.</param> /// <param name="Position">Target coord to aim camera at.</param> public Camera(float AspectRatio, Vector3 Target) { target = Target; aspectRatio = AspectRatio; ResetCamera(); } private void Rotate(Vector3 Axis, float Amount) { position = Vector3.Transform(position - target, Matrix.CreateFromAxisAngle(Axis, Amount)) + target; } /// <summary> /// Resets Default Values of the Camera /// </summary> private void ResetCamera() { speed = 0.05f; position = target + new Vector3(0f, 20f, 20f); projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, aspectRatio, 0.5f, 100f); CalculateViewMatrix(); } /// <summary> /// Updates the Camera. Should be first thing done in Draw loop /// </summary> public void UpdateCamera() { Rotate(Vector3.Right, speed); CalculateViewMatrix(); } /// <summary> /// Calculates the View Matrix for the camera /// </summary> private void CalculateViewMatrix() { view = Matrix.CreateLookAt(position,target, camup); } I'm trying to create the camera so that it can orbit the center of the model. For a test I am calling Rotate(Vector3.Right, speed); but it rotates almost right but gets to a point where it "flips." If I rotate along a different axis Rotate(Vector3.Up, speed); everything seems OK in that direction. So I guess, can someone tell me what I'm not accounting for in the above code I wrote? Or point me to an example of an orbiting camera that can be fixed on an arbitrary point?
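    The flip is the classic pole problem: the position orbits right over the top of the target while camup stays pinned to Vector3.Up, so CreateLookAt inverts the view once the look direction crosses the vertical. One fix is to rotate the stored up vector with the same matrix used for the position. A sketch of how Rotate could look under that approach (untested; names match the class above):

    ```csharp
    private void Rotate(Vector3 axis, float amount)
    {
        Matrix rotation = Matrix.CreateFromAxisAngle(axis, amount);

        // Rotate the offset from the target, then re-apply the target translation.
        position = Vector3.Transform(position - target, rotation) + target;

        // Rotate the camera's up vector by the same amount so it follows the orbit
        // instead of staying fixed at world Up (which is what causes the flip).
        camup = Vector3.TransformNormal(camup, rotation);
    }
    ```

    The other common approach is to accumulate explicit yaw and pitch angles and clamp pitch to just short of ±90 degrees before rebuilding the view matrix, which keeps world Up usable as the up vector.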

    Read the article

  • Where in the stack is Software Restriction Policies implemented?

    - by Knox
    I am a big fan of Software Restriction Policies for Microsoft Windows and was recently updating our settings for this. I became curious as to where Microsoft implemented this technology in the stack. I can imagine a very naive implementation being in Windows Explorer where when you double click on an exe or other blocked file type, that Explorer would check against the policy. I call this naive because obviously this wouldn't protect against someone typing something in a CMD window. Or worse, Adobe Reader running an external application. On the other hand, I can imagine that software restriction policies could be implemented deep in the stack almost at the metal. In this case, the low level loader would load into memory the questionable file, but mark the memory in the memory manager as non-executable data. I'm pretty sure that Microsoft did not do the most naive implementation, because if I block Java using a path block, Internet Explorer will crash if it attempts to load Java. Which is what I want. But I'm not sure how deep in the stack it's implemented and any insight would be appreciated.

    Read the article

  • Better approach to archiving large amounts of original video footage using optical media (DVD/Blu-ra

    - by Rob
    This question is to share my experience as well as to ask for suggestions for better methods. Along with two friends, I completed the making of a short documentary film in 2006. The clip is at: http://www.youtube.com/mediamotioninvision. The film was edited in Adobe Premiere Pro 1.5 on Windows XP. More details and a screenshot are here: http://www.flickr.com/photos/smilingrobbie/1350235514/ (note this is not intended to be a plug; we've moved on from this initial learning-curve project ;) ). The film is in 4:3 standard-definition 720x576 PAL format. As well as retaining the final 30-minute film, I wanted to keep all the original files that were assembled together to make the film. The footage was 83.5 GB, so I archived it to over twenty 4.7 GB DVD recordables in the original .avi format (i.e. data DVD-ROM format, NOT DVD-Video MPEG-2). Some .avi DV video files were larger than 4.7 GB, so I used 7-Zip to split them (here is a guide on how to do that: http://www.linglom.com/2008/10/12/how-to-split-a-large-file-using-7-zip/ ). To recombine them, a DOS shell command like this does the job: copy /b file.avi.* file.avi, where .* is a wildcard to include all the split parts, e.g. 001, 002...00n, assuming they are all in the same directory. file.avi is the recombined file, identical to the original. Later on, I bought an LG BE06 LU10 USB 2.0 Super Multi Blu-ray burner and archived the footage to 2 (two) x 50 GB BD-R DL discs, again in the original format, written as files to a BD-R in the BD-ROM UDF format readable by PC/Mac etc., NOT Blu-ray video/film format. This seems to be a good solution for me, because: the archive is in a robust, reasonably permanent, non-volatile medium, i.e. DVD recordable / Blu-ray (debates about the stability of optical media's organic dye compounds/substrates aside); and the format of the archive is accessible by open-source tools or just plain Windows Explorer, and it's not in a proprietary format. I just thought I'd ask folks for their experience with better methods, if such exist.
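    One thing worth adding to that workflow is an integrity check: hash the original .avi before splitting and hash the recombined file after the copy /b step, and only delete the source once they match. A small C# sketch (the file paths are placeholders):

    ```csharp
    using System;
    using System.IO;
    using System.Security.Cryptography;

    class VerifyRecombined
    {
        static string Md5Of(string path)
        {
            using (var md5 = MD5.Create())
            using (var stream = File.OpenRead(path))
                return BitConverter.ToString(md5.ComputeHash(stream));
        }

        static void Main()
        {
            string original   = @"D:\footage\tape01.avi";   // hypothetical paths
            string recombined = @"E:\restore\tape01.avi";

            bool match = Md5Of(original) == Md5Of(recombined);
            Console.WriteLine(match
                ? "OK: recombined file matches the original."
                : "MISMATCH: do not delete the source footage!");
        }
    }
    ```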

    Read the article

  • User-friendly program to create editable & searchable pdf files, like tax & application forms and su

    - by Nick Gorbikoff
    Hello. Can somebody recommend a user-friendly program that will create (or convert from Excel, Word, or OpenOffice) editable PDF forms? You know, like the tax forms some of us have filled out, where you can create a form with a predefined format and stationery but let the user edit/fill out the fields. I need something user-friendly that a regular person can use. I'm NOT looking for a PDF library (I already use wkhtmltopdf for generating PDFs programmatically). The reason is that we have about 400 documents (internal expense forms, training forms, etc.) in .doc and .xls format that we want to convert to editable PDF (so that people don't have to fill them out by hand). Coding 400 templates and then converting them using some library or command-line tool is not my idea of fun, especially since those forms change all the time. I'd like to just give the HR and Quality departments the tool, so that they can maintain those documents. I looked at everything listed on this page (http://www.cogniview.com/convert-pdf-to-excel/post/pdf-editing-creation-50-open-sourcefree-alternatives-to-adobe-acrobat/), but can't find what I need. Thank you!!!

    Read the article

  • Is inconsistent formatting a sign of a sloppy programmer?

    - by dreza
    I understand that everyone has their own style of programming and that you should be able to read other people's styles and accept it for what it is. However, would one be considered a sloppy programmer if one's style of coding was inconsistent across whatever standard they were working against? Some example of inconsistencies might be: Sometimes naming private variables with _ and sometimes not Sometimes having varying indentations within code blocks Not aligning braces up i.e. same column if using start using new line style Spacing not always consistent around operators i.e. p=p+1, p+=1 vs other times p =p+1 or p = p + 1 etc Is this even something that as a programmer I should be concerned with addressing? Or is it such a minor nit picking thing that at the end of the day I should just not worry about it and worry about what the end user sees and whether the code works rather than how it looks while working? Is it sloppy programming or just over obsessive nit picking? EDIT: After some excellent comments I realized I may have left out some information in my question. This question came about after reviewing another colleagues code check-in and noticing some of these things and then realizing that I've seen these kind of in-consistencies in previous check-ins. It then got me thinking about my code and whether I do the same things and noticed that I typically don't etc I'm not suggesting his technique is either bad or good in this question or whether his way of doing things is right or wrong. EDIT: To answer some queries to some more good feed back. The specific instance this review occurred in was using Visual Studio 2010 and programming in c# so I don't think the editor would cause any issues. In fact it should only help I would hope. Sorry if I left that piece of info out and it effects any current answers. I was trying to be a bit more generic in understanding if this would be considered sloppy etc. And to add an even more specific example of a code piece I saw during reading of the check-in: foreach(var block in Blocks) { // .. some other code in here foreach(var movement in movements) { movement.Moved.Zero(); } // the un-formatted brace } Such a minor thing I know, but many small things add up(???), and I did have to double glance at the code at the time to see where everything lined up I guess. Please note this code was formatted appropriately before this check-in. EDIT: After reading some great answers and varying thoughts, the summary I've taken from this was. It's not necessarily a sign of a sloppy programmer however as programmers we have a duty (to ourselves and other programmers) to make the code as readable as possible to assist in further ongoing development. However it can hint at inadequacies which is something that is only possible to review on a case by case (person by person) basis. There are many reasons why this might occur. They should be taken in context and worked through with the person/people involved if reasonable. We have a duty to try and help all programmers become better programmers! In the good old days when development was done using good old notepad (or other simple text editing tool) this occurred much more frequently. However we have the assistance of modern IDE's now so although we shouldn't necessarily become OTT about this, it should still probably be addressed to some degree. We as programmers vary in our standards, styles and approaches to solutions. 
    However, it seems that in general we all take PRIDE in our work, and that is a trait that can set programmers apart. Making something to the best of our abilities, both internal (the code) and external (the end-user result), goes a long way toward giving us that big fat pat on the back that we may not go looking for, but that swells our hearts with pride. And finally, to quote CrazyEddie from his post below: don't sweat the small stuff.

    Read the article

  • Unable to Access Certain Websites

    - by codejoust
    Through the local network, all computers except one Ubuntu machine can access 1. Adobe.com 2. Icann.org 3. Apache.org 4. Example.com. The Ubuntu machine returns (in Firefox): "Though the site seems valid, the browser was unable to establish a connection." Furthermore, when I traceroute those websites from the Ubuntu machine, they all return ubuntu.local, and it ends there: (traceroute to icann.org (192.0.32.7), 30 hops max, 40 byte packets 1 ubuntu.local (192.168.1.105) 3000.791 ms !H 3000.808 ms !H 3000.814 ms !H). I've checked the hosts file and there isn't anything in there, and I have an Apache server on the machine, so if the sites were being redirected to localhost I'd probably see the localhost web root page. Thanks in advance! user@ubuntu:~$ netstat -nr Kernel IP routing table Destination Gateway Genmask Flags MSS Window irtt Iface 169.254.0.0 0.0.0.0 255.255.0.0 U 0 0 0 eth1 192.0.0.0 0.0.0.0 255.0.0.0 U 0 0 0 eth1 0.0.0.0 192.168.1.1 0.0.0.0 UG 0 0 0 eth1 The Ubuntu machine is one of six on the network. I'm using OpenDNS for DNS, so I don't think that should be the problem.

    Read the article

  • Standards Corner: Preventing Pervasive Monitoring

    - by independentid
     Phil Hunt is an active member of multiple industry standards groups and committees and has spearheaded discussions, creation and ratifications of industry standards including the Kantara Identity Governance Framework, among others. Being an active voice in the industry standards development world, we have invited him to share his discussions, thoughts, news & updates, and discuss use cases, implementation success stories (and even failures) around industry standards on this monthly column. Author: Phil Hunt On Wednesday night, I watched NBC’s interview of Edward Snowden. The past year has been tumultuous one in the IT security industry. There has been some amazing revelations about the activities of governments around the world; and, we have had several instances of major security bugs in key security libraries: Apple's ‘gotofail’ bug  the OpenSSL Heartbleed bug, not to mention Java’s zero day bug, and others. Snowden’s information showed the IT industry has been underestimating the need for security, and highlighted a general trend of lax use of TLS and poorly implemented security on the Internet. This did not go unnoticed in the standards community and in particular the IETF. Last November, the IETF (Internet Engineering Task Force) met in Vancouver Canada, where the issue of “Internet Hardening” was discussed in a plenary session. Presentations were given by Bruce Schneier, Brian Carpenter,  and Stephen Farrell describing the problem, the work done so far, and potential IETF activities to address the problem pervasive monitoring. At the end of the presentation, the IETF called for consensus on the issue. If you know engineers, you know that it takes a while for a large group to arrive at a consensus and this group numbered approximately 3000. When asked if the IETF should respond to pervasive surveillance attacks? There was an overwhelming response for ‘Yes'. When it came to 'No', the room echoed in silence. This was just the first of several consensus questions that were each overwhelmingly in favour of response. This is the equivalent of a unanimous opinion for the IETF. Since the meeting, the IETF has followed through with the recent publication of a new “best practices” document on Pervasive Monitoring (RFC 7258). This document is extremely sensitive in its approach and separates the politics of monitoring from the technical ones. Pervasive Monitoring (PM) is widespread (and often covert) surveillance through intrusive gathering of protocol artefacts, including application content, or protocol metadata such as headers. Active or passive wiretaps and traffic analysis, (e.g., correlation, timing or measuring packet sizes), or subverting the cryptographic keys used to secure protocols can also be used as part of pervasive monitoring. PM is distinguished by being indiscriminate and very large scale, rather than by introducing new types of technical compromise. The IETF community's technical assessment is that PM is an attack on the privacy of Internet users and organisations. The IETF community has expressed strong agreement that PM is an attack that needs to be mitigated where possible, via the design of protocols that make PM significantly more expensive or infeasible. Pervasive monitoring was discussed at the technical plenary of the November 2013 IETF meeting [IETF88Plenary] and then through extensive exchanges on IETF mailing lists. This document records the IETF community's consensus and establishes the technical nature of PM. 
The draft goes on to further qualify what it means by “attack”, clarifying that  The term is used here to refer to behavior that subverts the intent of communicating parties without the agreement of those parties. An attack may change the content of the communication, record the content or external characteristics of the communication, or through correlation with other communication events, reveal information the parties did not intend to be revealed. It may also have other effects that similarly subvert the intent of a communicator.  The past year has shown that Internet specification authors need to put more emphasis into information security and integrity. The year also showed that specifications are not good enough. The implementations of security and protocol specifications have to be of high quality and superior testing. I’m proud to say Oracle has been a strong proponent of this, having already established its own secure coding practices. 

    Read the article

  • Guest blog: A Closer Look at Oracle Price Analytics by Will Hutchinson

    - by Takin Babaei
    Overview:  Price Analytics helps companies understand how much of each sale goes into discounts, special terms, and allowances. This visibility lets sales management see the panoply of discounts and start seeing whether each discount drives desired behavior. In Price Analytics monitors parts of the quote-to-order process, tracking quotes, including the whole price waterfall and seeing which result in orders. The “price waterfall” shows all discounts between list price and “pocket price”. Pocket price is the final price the vendor puts in its pocket after all discounts are taken. The value proposition: Based on benchmarks from leading consultancies and companies I have talked to, where they have studied the effects of discounting and started enforcing what many of them call “discount discipline”, they find they can increase the pocket price by 0.8-3%. Yes, in today’s zero or negative inflation environment, one can, through better monitoring of discounts, collect what amounts to a price rise of a few percent. We are not talking about selling more product, merely about collecting a higher pocket price without decreasing quantities sold. Higher prices fall straight to the bottom line. The best reference I have ever found for understanding this phenomenon comes from an article from the September-October 1992 issue of Harvard Business Review called “Managing Price, Gaining Profit” by Michael Marn and Robert Rosiello of McKinsey & Co. They describe the outsized impact price management has on bottom line performance compared to selling more product or cutting variable or fixed costs. Price Analytics manages what Marn and Rosiello call “transaction pricing”, namely the prices of a given transaction, as opposed to what is on the price list or pricing according to the value received. They make the point that if the vendor does not manage the price waterfall, customers will, to the vendor’s detriment. It also discusses its findings that in companies it studied, there was no correlation between discount levels and any indication of customer value. I urge you to read this article. What Price Analytics does: Price analytics looks at quotes the company issues and tracks them until either the quote is accepted or rejected or it expires. There are prebuilt adapters for EBS and Siebel as well as a universal adapter. The target audience includes pricing analysts, product managers, sales managers, and VP’s of sales, marketing, finance, and sales operations. It tracks how effective discounts have been, the win rate on quotes, how well pricing policies have been followed, customer and product profitability, and customer performance against commitments. It has the concept of price waterfall, the deal lifecycle, and price segmentation built into the product. These help product and sales managers understand their pricing and its effectiveness on driving revenue and profit. They also help understand how terms are adhered to during negotiations. They also help people understand what segments exist and how well they are adhered to. To help your company increase its profits and revenues, I urge you to look at this product. If you have questions, please contact me. Will HutchinsonMaster Principal Sales Consultant – Analytics, Oracle Corp. Will Hutchinson has worked in the business intelligence and data warehousing for over 25 years. He started building data warehouses in 1986 at Metaphor, advancing to running Metaphor UK’s sales consulting area. He also worked in A.T. 
Kearney’s business intelligence practice for over four years, running projects and providing training to new consultants in the IT practice. He also worked at Informatica and then Siebel, before coming to Oracle with the Siebel acquisition. He became Master Principal Sales Consultant in 2009. He has worked on developing ROI and TCO models for business intelligence for over ten years. Mr. Hutchinson has a BS degree in Chemical Engineering from Princeton University and an MBA in Finance from the University of Chicago.

    Read the article

  • Day 5 - Tada! My Game Menu Screen Graphics

    - by dapostolov
    So, tonight I took some time to mash up some graphics for my game menu screen. My artistic talent sucks...but here goes nothing...voila, my menu screen!! The Menu Screen The screen above is displaying 4 sprites, even though it looks like maybe 7... I guess one of the first things for me to test in the future is ... is it more memory efficient (and better frame rate) to draw one big background image OR tp paint the screen black, and place each sprite in set locations? To display the 4 sprites above, I borrowed my code from yesterday ... I know, tacky, but...I wanted to see it, feel it. Do you feel it? FEEL IT! (homer voice & shakes fist) Note: the menu items won't scale properly as it stands with this code, well pretty much they do nothing except look pretty... Paint.Net & Google Fun So how did I create that image above? Well, to create the background and 3 menu items I used Paint.Net. Basically, I scoured Google images for: a stone doorway, a stone pillar, an old book, a wizards hat, and...that's it pretty much it! I'll let you type in those searches and see if you can locate the images I used. I know, bad developer...but I figured since I modified the images considerably it doesn't count...well for a personal project it shouldn't count...*shrug* Anyhow, I extracted each key assest I wanted from each image and applied lots of matting, blurring, color changes, glow effects and such. Then, using my vivid imagination I placed / composed each of the layered assets into the mashed up the "scene" above. Pretty cool, eh? Hey, did you know, the cool mist effect is actually a fire rendition in Paint.net? I set it to black & white with opacity set next to nothing. I'm also very proud of the yellow "light" in the stone doorway. I drew that in and then applied gausian blur to it to give it the effect of light creeping out around the door and into the room...heheh. So did I achieve the dark, mysterious ritual as I stated in my design doc? I think I had a great stab at it! Maybe down the road I can get a real artist to crank out some quality graphics for the game... =) So, What's Next? Well, I don't have that animated brazier yet...however, I thought it would be even cooler if I can get that door pulsing that yellow light and it would be extremely cool to have the smoke / mist moving across the screen! Make the creative ideas stop!! (clutches head) haha! I'm having great fun working on this project =) I recommend others giving something like this a try, it's really fulfilling. OK. Tomorrow... I think I'm going to start creating some game / menu objects as per the design doc, maybe even get a custom mouse cursor up on the screen and handle a couple of mouse events, and lastly, maybe a feature to toggle a framerate display... D.

    Read the article

  • Can Vista Windows Explorer be repaired/fixed?

    - by gurun8
    I'm running Windows Vista Home Premium on a Sony Vaio laptop. I think somehow my Windows Explorer has become corrupted. I don't recall any particular inciting incident, like installing a disreputable third-party app, or a hardware failure, but just recently my system won't wake after it's been left idle longer than 20 minutes and goes to sleep. I've also had problems launching certain apps, like Adobe InDesign CS3, that basically freeze the system but leave mouse movement functioning for a short time before the entire system freezes, which requires a hard reboot to resolve. The system seems to run normally when used, but I fear there's a looming possibility that this is a house of cards and will all come crashing down soon. My question is this: can Windows Explorer be repaired/fixed? Before reformatting the system and starting over, which is most likely what I'm going to be forced to do, I'd like to explore (no pun intended) my options for fixing the problem with a patch or reinstall or something of that nature. Reformatting my system will eat up a day or two of my time and I just don't have the time to spare right now.

    Read the article

  • Workflow versioning

    - by Nitra
    I believe I have a fundamental misunderstanding when it comes to workflow engines, which I would appreciate your help in sorting out. I'm not sure if my misunderstanding is specific to the workflow engine I'm using, or if it's a general misunderstanding. I happen to use Windows Workflow Foundation (WWF). TLDR version: WWF allows you to implement business processes in long-running workflows (think months or even years). Once started, the workflows can't be changed. But what business process can't change at any time? And if a business process changes, wouldn't you want your software to reflect this change for already-started 'instances' of the business process? What am I missing? Background: In WWF you define a workflow by combining a set of activities. There are different types of activities - some of them are for flow control, such as the IfElseActivity and the WhileActivity, while others allow you to perform actual tasks, such as the CodeActivity, which allows you to run .NET code, and the InvokeWebServiceActivity, which allows you to call web services. The activities are combined into a workflow using a visual designer. You pretty much drag and drop activities from a toolbox to a designer area and connect the activities to each other. The workflow and activities have input parameters, output parameters and variables. We have a single workflow which sometimes runs in a matter of a few days, but it may run for 5-6 months. WWF takes care of persisting the workflow state (what activity we are currently executing, what the variable values are, and so on). So far I think WWF makes sense. Some people will prefer to implement a software representation of a business process using a visual designer over writing all of it in code. So what's the issue then? What I don't really get is the following: WWF is designed to take care of long-running workflows, but at the same time, WWF has no built-in functionality which allows you to modify running workflows. So if you model a business process using a workflow and run that for 6 months, you had better hope that the business process does not change, because if it does, you'll have to have multiple versions of the workflow executing at the same time. This seems like a fundamental design mistake to me, but at the same time it seems more likely that I've misunderstood something. For us, this has had some real-world effects: We release new versions every month, but some workflows may run for a year. This means that we have several versions of the workflow running in parallel - in other words, several versions of the business logic. This is the same as having many different versions of your code running in production in the same system at the same time, which becomes a bit hard to understand for users (depending on whether they clicked a 'Start' button 9 or 10 months ago, the software will behave differently). Our workflow refers to different types of entities, and since WWF has now persisted and serialized these, we can't really refactor the entities, since existing workflows couldn't be resumed (deserialization would fail). We've received some suggestions on how to handle this: When we create a new version of the workflow, cancel all running workflows and create new ones. But in our workflows there's a lot of manual work involved, and if we start from scratch a lot of people have to re-do their work. Track what has been done in the workflow and, when you create a new one, skip activities which have already been executed.
I feel that this alternative may work for simple workflows, but it becomes hairy to automatically figure out what activities to skip if there's major refactoring done to a workflow. When we create a new version of the workflow, upgrade old versions using the new WWF 4.5 functionality for upgrading workflows. But then we would have to skip using the visual designer and write code to inject activities in the right places in the workflow. According to MSDN, this upgrade functionality is only intended for minor bug fixes and not larger changes. What am I missing?
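    For readers who haven't used WF, activities compose into a workflow roughly like this - a minimal WF4-style sketch (simplified; a real long-running workflow would be hosted with persistence through WorkflowApplication rather than invoked inline):

    ```csharp
    using System;
    using System.Activities;
    using System.Activities.Statements;

    class WorkflowSketch
    {
        static void Main()
        {
            // A tiny workflow definition: activities composed in a Sequence.
            Activity process = new Sequence
            {
                Activities =
                {
                    new WriteLine { Text = "Order received" },
                    new Delay { Duration = TimeSpan.FromSeconds(2) },  // stand-in for a long-running step
                    new WriteLine { Text = "Order approved" }
                }
            };

            // Instances created from this definition keep running against it even
            // after the definition changes in a later release - which is exactly
            // the versioning problem described above.
            WorkflowInvoker.Invoke(process);
        }
    }
    ```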

    Read the article

  • HP G61 laptop won't boot - display stays off, Caps Lock and Num Lock indicators blink repeatedly

    - by Benguy12
    I had my HP G61 laptop running in sleep mode for a while. When I came back to it about a half-hour later, it was no longer in sleep mode - the power light and the Wi-Fi indicator light were on (I keep Wi-Fi off because I use a wired connection) - but nothing was showing on screen. In fact, the display wasn't even turned on. So I let it sit for about 10 minutes but nothing happened. I did a force shut-down and rebooted. Instead of a normal boot, the display didn't turn on, the Wi-Fi indicator was off, and the Caps Lock and Num Lock lights just blinked repeatedly. On the external keyboard I use, none of the light indicators were blinking or even on. I tried a force shut-down again 10 times, then unplugged all connections except for the power cable (my laptop battery doesn't hold a charge for more than 2 minutes, so I always must have a wall connection) and tried to boot again, but still nothing happened. I unplugged the battery and even then nothing happened. I also tried booting with the disc drive open, and then with it closed again. On the attempt with it closed, I was able to successfully boot into Windows, but received a "Windows did not shut down successfully" notice. Does anybody know why this may have happened? My PC's specs: Windows 7 Home Premium, 64-bit; 4 GB of physical RAM, 8 GB of vRAM (on a flash drive); AMD Vision x64 processor (don't know any other specs about it); ATI Radeon graphics card, 392 MB; DVD-R/W LightScribe drive; 2 external hard disks (the first one is 1.5 TB, the second 1 TB); custom boot screen and boot animation; standard BIOS. Apps running before sleep: Firefox 10.4, iTunes 10.6, Adobe Photoshop Extended CS5.1, Rockstar Games Social Club (running in background), Microsoft PowerPoint 2010 Professional Edition, Google Chrome. I was NOT running Aero or any fancy themes - I was using the normal Windows Classic theme. I have a desktop icon manager application called Stardock Fences that was also running (it runs as a service/process).

    Read the article

  • Disappearing Arial on 2 Macs

    - by drewk
    I noticed that Safari started rendering common web pages in a funny manner on two different Macs that I have. One is a Macbook Pro and the other a Mac Pro desktop. Yahoo and Google would appear all excessively bold or all italic and not at all look right or acceptable. The computers are all running OS X 10.6.3 "Snow Leopard" Turns out that "Arial.TTF" and "Arial Bold.ttf" got deleted somehow on these two computers. I restored Arial through Font Book and got my web mojo back. So questions: 1) Anyone seen "arial" strangely randomly disappear? The only thing in common is these are the only two computers out of eight on site that recently got Adobe CS 5 installed. Has anyone had CS 5 delete arial? 2) When I restored arial with font book, it goes into User fonts rather than the All fonts. Can I use Font Book to restore a font in /System/Library/Fonts or do I need to do that manually? 3) I located THIS article on the web regarding OS X fonts. Essentially, Snow Leopard did away with the older .dfont format and replaced with open True Type. There is a minimum font list, but Arial is not among them. Arial is installed by MS Office. 4) Why are web sites affected by Arial being missing anyway? If I look at the HTML source for Yahoo for example, "arial" is specified by name only in an ad. Yahoo itself does not specify a font name. In my Safari preferences, I have Times and Courier specified as the default font which is the default for Safari when installed. How does a missing Arial screw things up anyway? Thanks in advance.

    Read the article

  • Bitmap font rendering, UV generation and vertex placement

    - by jack
    I am generating a bitmap font, but I am not sure how to generate the UVs or place the vertices. I had a thread like this once before, but it was too loosely worded as to what I was looking to do. What I am doing right now is creating a large 1024x1024 image with characters evenly placed every 64 pixels. Here is an example of what I mean. I then save the bitmap X/Y information to a file (all values are multiples of 64). However, I am not sure how to properly use this information and the bitmap to render. This falls into two different categories: UV generation and kerning. I believe I know how to do both of these; however, when I attempt to couple them together I get horrendous results. For example, I am trying to render two different text arrays, "123" and "njfb". Ignoring the texture quality (I will be increasing the texture to provide more detail once I fix this issue), here is what it looks like when I try to render them: http://img64.imageshack.us/img64/599/badfontrendering.png Now for the algorithm. I am doing my letter placement with both GetABCWidth and GetKerningPairs. I am using GetABCWidth for the width of the characters, then getting the kerning information to adjust the spacing between characters. Does anyone have any suggestions on how I can implement my own bitmap font renderer? I am trying to do this without using external libraries such as AngelCode's bitmap font tool or FreeType. I also want to stick with the way the bitmap font sheet is generated so I can do extra effects in the future.

    Rendering Algorithm

        for(U32 c = 0, vertexID = 0, i = 0; c < numberOfCharacters; ++c, vertexID += 4, i += 6)
        {
            ObtainCharInformation(fontName, m_Text[c]);
            letterWidth = (charInfo.A + charInfo.B + charInfo.C) * scale;

            if(c != 0)
            {
                DWORD BytesReq = GetGlyphOutlineW(dc, m_Text[c], GGO_GRAY8_BITMAP, &gm, 0, 0, &mat);
                U8 * glyphImg = new U8[BytesReq];
                DWORD r = GetGlyphOutlineW(dc, m_Text[c], GGO_GRAY8_BITMAP, &gm, BytesReq, glyphImg, &mat);

                for (int k = 0; k < nKerningPairs; k++)
                {
                    if ((kerningpairs[k].wFirst == previousCharIndex) && (kerningpairs[k].wSecond == m_Text[c]))
                    {
                        letterBottomLeftX += (kerningpairs[k].iKernAmount * scale);
                        break;
                    }
                }

                letterBottomLeftX -= (gm.gmCellIncX * scale);
            }

            SetVertex(letterBottomLeftX, 0.0f, zFight, vertexID);
            SetVertex(letterBottomLeftX, letterHeight, zFight, vertexID + 1);
            SetVertex(letterBottomLeftX + letterWidth, letterHeight, zFight, vertexID + 2);
            SetVertex(letterBottomLeftX + letterWidth, 0.0f, zFight, vertexID + 3);
            zFight -= 0.001f;

            float BottomLeftX = (F32)(charInfo.bitmapXOrigin) / (float)m_BitmapWidth;
            float BottomLeftY = (F32)(charInfo.bitmapYOrigin + charInfo.charBitmapHeight) / (float)m_BitmapWidth;
            float TopLeftX = BottomLeftX;
            float TopLeftY = (F32)(charInfo.bitmapYOrigin) / (float)m_BitmapWidth;
            float TopRightX = (F32)(charInfo.bitmapXOrigin + charInfo.B - charInfo.C) / (float)m_BitmapWidth;
            float TopRightY = TopLeftY;
            float BottomRightX = TopRightX;
            float BottomRightY = BottomLeftY;

            SetTextureCoordinate(TopLeftX, TopLeftY, vertexID + 1);
            SetTextureCoordinate(BottomLeftX, BottomLeftY, vertexID + 0);
            SetTextureCoordinate(BottomRightX, BottomRightY, vertexID + 3);
            SetTextureCoordinate(TopRightX, TopRightY, vertexID + 2);

            /// index setting

            letterBottomLeftX += letterWidth;
            previousCharIndex = m_Text[c];
        }
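
    For reference, the UV half of this reduces to very little arithmetic. The following is only an illustrative sketch - the names are hypothetical, it is written in C# purely for brevity rather than matching the C++ above, and it assumes a top-left UV origin (the Direct3D convention) with one glyph per 64-pixel cell in a 1024x1024 atlas. It maps a cell to normalized texture coordinates and advances the pen using ABC-style widths plus a kerning amount.

        // Hypothetical sketch: one glyph quad laid out from a 64px cell in a 1024x1024 atlas.
        struct GlyphQuad
        {
            public float U0, V0, U1, V1; // texture coords: top-left and bottom-right
            public float X0, X1;         // quad edges in pixels (Y omitted for brevity)
            public float NextPenX;       // pen position for the following character
        }

        static class BitmapFontLayout
        {
            const float AtlasSize = 1024f; // atlas is 1024x1024
            const float CellSize  = 64f;   // one glyph per 64px cell

            // cellX/cellY: pixel origin of the glyph's cell (multiples of 64).
            // abcA/abcB/abcC: the glyph's ABC widths. kern: kerning for this pair.
            public static GlyphQuad Layout(float penX, int cellX, int cellY,
                                           int abcA, int abcB, int abcC, int kern)
            {
                GlyphQuad q;
                q.X0 = penX + abcA + kern;            // A = left-side bearing, kern shifts the pair
                q.X1 = q.X0 + abcB;                   // B = inked width that is actually textured
                q.U0 = cellX / AtlasSize;             // left edge of the cell in UV space
                q.V0 = cellY / AtlasSize;             // top edge of the cell in UV space
                q.U1 = (cellX + abcB) / AtlasSize;    // sample only the inked part of the cell
                q.V1 = (cellY + CellSize) / AtlasSize;
                q.NextPenX = penX + abcA + abcB + abcC + kern; // full advance for the next glyph
                return q;
            }
        }

    A common cause of stretched or jumbled glyphs when combining the two pieces is a mismatch between the pixel width of the quad and the pixel width actually sampled from the atlas: if the cell is sampled edge to edge (all 64 pixels) while the quad is only A+B+C (or B) pixels wide, every glyph ends up scaled by a different amount.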

    Read the article

  • XNA RenderTarget2D Sample

    - by Michael B. McLaughlin
    I remember being scared of render targets when I first started with XNA. They seemed like weird magic and I didn’t understand them at all. There’s nothing to be frightened of, though, and they are pretty easy to learn how to use. The first thing you need to know is that when you’re drawing in XNA, you aren’t actually drawing to the screen. Instead you’re drawing to this thing called the “back buffer”. Internally, XNA maintains two sections of graphics memory. Each one is exactly the same size as the other and has all the same properties (such as surface format, whether there’s a depth buffer and/or a stencil buffer, and so on). XNA flips between these two sections of memory every update-draw cycle. So while you are drawing to one, it’s busy drawing the other one on the screen. Then, when the current update-draw cycle ends, it flips, and the section you were just drawing to gets drawn to the screen while the one that was being drawn to the screen before is now the one you’ll be drawing on. This is what’s meant by “double buffering”. If you drew directly to the screen, the player would see all of those draws taking place as they happened, and that would look odd and not very good at all. Those two sections of graphics memory are render targets. All a render target is, is a section of graphics memory to which things can be drawn. In addition to the two that XNA maintains automatically, you can also create and set your own using RenderTarget2D and GraphicsDevice.SetRenderTarget. Using render targets lets you do all sorts of neat post-processing effects (like bloom) to make your game look cooler. It also lets you do things like motion blur and lets you create mirrors in 3D games. There are quite a lot of things that render targets let you do. To go along with this post, I wrote up a simple sample for how to create and use a RenderTarget2D. It’s available under the terms of the Microsoft Public License and is available for download on my website here: http://www.bobtacoindustries.com/developers/utils/RenderTarget2DSample.zip . Other than the ‘using’ statements, every line is commented in detail so that it should (hopefully) be easy to follow along with and understand. If you have any questions, leave a comment here or drop me a line on Twitter. One last note. While creating the sample I came across an interesting quirk. If you start by creating a Windows Game, and then make a copy for Windows Phone 7, the drop-down that lets you choose between drawing to a WP7 device and the WP7 emulator stays grayed-out. To resolve this, you need to right-click on the Windows Phone 7 version in the Solution Explorer and choose “Set as StartUp Project”. The bar will then become active, letting you change the target you wish to deploy to. If you want another version to be the one that starts up when you press F5 to start debugging, just go and right-click on that version and choose “Set as StartUp Project” for it once you’ve set the WP7 target (device or emulator) that you want.
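
    To give a sense of the pattern before downloading anything, here is a minimal sketch, assuming XNA 4.0. It is not the downloadable sample itself - the class and variable names are illustrative, and everything except the render-target calls is left out - but it shows the three steps in order: create the RenderTarget2D, redirect drawing with SetRenderTarget, then pass null to return to the back buffer and draw the result.

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class RenderTargetSketch : Game
        {
            GraphicsDeviceManager graphics;
            SpriteBatch spriteBatch;
            RenderTarget2D sceneTarget;

            public RenderTargetSketch()
            {
                graphics = new GraphicsDeviceManager(this);
            }

            protected override void LoadContent()
            {
                spriteBatch = new SpriteBatch(GraphicsDevice);
                // A render target with the same dimensions as the back buffer.
                sceneTarget = new RenderTarget2D(GraphicsDevice,
                    GraphicsDevice.PresentationParameters.BackBufferWidth,
                    GraphicsDevice.PresentationParameters.BackBufferHeight);
            }

            protected override void Draw(GameTime gameTime)
            {
                // 1. Redirect all drawing into the custom render target.
                GraphicsDevice.SetRenderTarget(sceneTarget);
                GraphicsDevice.Clear(Color.CornflowerBlue);
                spriteBatch.Begin();
                // ... draw the scene here ...
                spriteBatch.End();

                // 2. Switch back to the back buffer (passing null selects the back buffer).
                GraphicsDevice.SetRenderTarget(null);
                GraphicsDevice.Clear(Color.Black);

                // 3. In XNA 4.0 a RenderTarget2D is a Texture2D, so the finished scene
                //    can be drawn like any other texture; a post-process would sample it here.
                spriteBatch.Begin();
                spriteBatch.Draw(sceneTarget, Vector2.Zero, Color.White);
                spriteBatch.End();

                base.Draw(gameTime);
            }
        }

    Step 3 works because of the point above about render targets just being sections of graphics memory: in XNA 4.0, RenderTarget2D derives from Texture2D, so once you have finished drawing into it you can hand it straight to SpriteBatch.Draw and apply whatever effect you like on the way to the screen.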

    Read the article

  • HD video editing system with Truecrypt

    - by Rob
    I'm looking to do hi-def video editing and transcoding on an unencrypted standard partition, with Truecrypt on the system partition for sensitive data. I'm aiming to keep certain data private but still have full performance where needed. Goals: maximum, unimpacted performance for hi-def video editing (encryption of the video itself is not required); encrypt the system partition, using Truecrypt, for web/email privacy, etc. in the event of loss. In other words, I want to selectively encrypt the hard drive - i.e. make the system partition encrypted but not impact the maximum performance that would otherwise be available to me for hi-def/HD video editing. The thinking is to use an unencrypted partition for the video and set up the video applications to point at that. Assuming they use that partition only for their workspace and not the encrypted system partition, I should expect no performance hit. Would I be correct? I guess it might depend on the application - whether the app is hard-wired to always use the system partition for temporary storage during editing and transcoding, or whether it has to be installed on the C: system partition. So some real data on how various apps behave in that respect would be useful, e.g. Adobe, Cyberlink, Nero, etc. I have an Intel i7 quad-core (8 threads), 1.6GHz (up to 2.8GHz turbo boost), 4GB RAM, 7200rpm SATA, NVIDIA HP laptop. I've read the excellent posting about the general performance impact of Truecrypt, but the benchmarks weren't specific enough for my needs, where I'm dealing with HD video and using a non-encrypted partition to maintain maximum performance.

    Read the article

  • Windows 7 Printing Issue

    - by Adrian Godong
    I am using Windows 7 RTM x64. From Control Panel > Devices and Printers, I have three printers listed: Fax, XPS Writer, and a Lexmark. I can print a test page through the printer properties with no problem. I can print a text file from Notepad with no problem. I can't print from Safari: when I press Ctrl+P, it displays the Print dialog, I press OK, and nothing happens. I can't print from Adobe Reader: when I press Ctrl+P, it complains that there is no printer installed. I can't print from Office applications: when I press Ctrl+P, the application crashes immediately, and running Office Diagnostics does not help. I can't print from IE8: when I press Ctrl+P, it displays the Print dialog and complains that I have to select a printer from the list; whichever of the three printers I select, the Print button stays disabled. Any help? Update (01/11/2009): The default printer is the Lexmark, and that is the one I'm testing with. I was about to reinstall Office (as this was the first application that had the problem), but then I tried others; some behave similarly but not identically (maybe caused by different printing implementations). In those applications that are able to display the printer selection dialog, I tried the Lexmark and XPS printers. Neither printed anything (paper from the Lexmark, a file from XPS). Update (01/12/2009): It seems that my Windows installation is botched. A colleague has a similar hardware/software combination (the same workstation model and Windows 7 x64) and he can print perfectly fine. I tried adding the printer from his share - no joy. I can print a test page from the printer properties and I can print from Notepad, but not from any other application.

    Read the article

< Previous Page | 126 127 128 129 130 131 132 133 134 135 136 137  | Next Page >