
Wicpar

Alpha Tester

Everything posted by Wicpar

  1. Wicpar

    Cities

    The best compromise would be to drop a blueprint copy, with the user who died temporarily unable to use that blueprint.
  2. This is an amazing idea. I would add modules that let you intercept and modify packets, and do spying stuff.
  3. The only calculations that are expensive are the collision detections. Torque, rotation, speed and acceleration are cheap as hell; it would be stupid to remove such fast calculations for the sake of optimization. It would be more productive to improve the algorithms (the sketch below shows why collisions dominate).
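A minimal sketch of that asymmetry, assuming a naive engine loop (my illustration, not NQ's code): integrating motion is a few multiply-adds per body, so it is linear in the number of bodies, while pairwise collision tests grow quadratically, which is exactly where better algorithms pay off.

```python
# Toy comparison: integration is O(n), naive collision detection O(n^2).
def integrate(bodies, dt):
    for b in bodies:                       # a few multiply-adds per body
        b["vel"] = [v + a * dt for v, a in zip(b["vel"], b["acc"])]
        b["pos"] = [p + v * dt for p, v in zip(b["pos"], b["vel"])]

def naive_collisions(bodies, radius=1.0):
    hits = []
    for i in range(len(bodies)):           # every pair: this dominates
        for j in range(i + 1, len(bodies)):
            d2 = sum((a - b) ** 2
                     for a, b in zip(bodies[i]["pos"], bodies[j]["pos"]))
            if d2 < (2 * radius) ** 2:
                hits.append((i, j))
    return hits
```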
  4. What is fun about waiting? Letting people impact their skills is more fun than making them wait.
  5. I code asm on the console. Try harder?
  6. They may grind at the start, but that exercise is futile if you gain more from autoskill than from grinding over the same time. The best way to keep this from becoming a system of slow progression is for higher-tier tasks and productions to grant more skill points, forcing you to constantly change activity, which makes your concerns invalid. My position will always be the one of full control, and autoskill in itself is quite frustrating if I may say so. If the tiers give exponentially more skill points and skill gain per tier is logarithmic, the progression becomes linear, which is more natural (see the sketch below)... but then it creates the problem where you can never get near the veterans. So keeping a linear skill gain and a logarithmic skill requirement leaves a logarithmic progression that does not slow down too fast, which keeps things interesting.
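A toy check of that claim (the exact curves are my assumption of what was meant, not NQ's design): if tier t awards on the order of e^t points and skill is the log of accumulated points, skill grows roughly linearly with the tier.

```python
import math

points = 0.0
for t in range(1, 11):
    points += math.exp(t)        # exponentially growing reward per tier
    skill = math.log(points)     # logarithmic skill gain from points
    print(t, round(skill, 2))    # approaches t + 0.46: linear progression
```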
  7. I would concur on half of it. Each ore has a different random chemical composition. After that you can extract all the different materials in pure form and then synthesize the molecules to generate the materials you need to build the ships. This would not only deepen the experience quite a bit, but could also teach people about chemistry and even help researchers discover new molecules if the system takes actual physics into account (like EVE did with the cell thing, but in a way that is useful in-game). But we can go further: instead of requiring one precise molecule, you could need a family of molecules whose stats differ depending on the variations, giving you the ability to min-max and make the best products you can. Imagine a game where actual scientific-ish research is a game mechanic (a toy sketch follows)!
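A toy sketch of that loop (all names and numbers are hypothetical, not NQ's design): ores carry a random composition, and a product's stats depend on which variant of a molecule family you refine out of them.

```python
import random

def generate_ore(seed):
    # Each ore gets a random, normalized chemical composition.
    rng = random.Random(seed)
    elements = ["Fe", "Al", "Si", "Ti"]
    amounts = [rng.random() for _ in elements]
    total = sum(amounts)
    return {e: a / total for e, a in zip(elements, amounts)}

def refine(ore, variant):
    # Variants of one "hull alloy" family trade strength against mass.
    strength = (ore["Fe"] * variant["fe_weight"]
                + ore["Ti"] * variant["ti_weight"])
    mass = 1.0 + ore["Si"] * variant["si_penalty"]
    return {"strength": round(strength, 3), "mass": round(mass, 3)}

ore = generate_ore(42)
variant = {"fe_weight": 2.0, "ti_weight": 3.5, "si_penalty": 0.5}
print(refine(ore, variant))   # min-max by scanning ores and variants
```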
  8. Have you ever programmed GLSL? GLSL has a very limited subset of functions and can be sandboxed if necessary; look at Shadertoy. GLSL in itself cannot do anything at all, really; it can only say "you gave me this input and I give you that output". If you play with the rendering pipeline it becomes dangerous, but even then only graphics could be altered. Lua can be way more dangerous if badly integrated, as it runs directly on the CPU. GLSL is sandboxed by nature because of its high level (it is compiled to a lower level, but there is no way to act upon that) and has no access to the hardware or memory itself; it just determines a read-only process. Your fear of the unknown is irrational.
  9. Individuality is irrelevant

  10. Wicpar

    Sensors

    Even that can be fooled. The problem is that you can only ever control the levels above you, never the ones below. The stack looks like this: reality > hardware > kernel > OS > program > cloud. Here you want the cloud to be in control of the program, and you can never be 100% in control of it, which is bad. But if NQ computes its physics on the cloud, then it only has to send the relevant information. Server-side computing is inevitable if you want security, and it is feasible. We are in 2016, right? (A minimal sketch follows.)
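A minimal sketch of server-authoritative simulation, as I understand the idea (my illustration, not NQ's netcode): clients send inputs, never state; the server owns the physics and replies with only the state each client is allowed to see.

```python
# 1-D toy world: positions and velocities live on the server only.
def step(world, inputs, dt=0.1):
    for pid, thrust in inputs.items():        # inputs are requests, not truth
        world[pid]["vel"] += max(-1.0, min(1.0, thrust)) * dt  # server clamps
    for body in world.values():
        body["pos"] += body["vel"] * dt       # authoritative integration
    def visible(pid):                         # send only nearby bodies
        me = world[pid]["pos"]
        return {q: dict(b) for q, b in world.items()
                if abs(b["pos"] - me) < 100.0}
    return {pid: visible(pid) for pid in inputs}

world = {"a": {"pos": 0.0, "vel": 0.0}, "b": {"pos": 50.0, "vel": 0.0}}
print(step(world, {"a": 0.5, "b": -2.0}))     # b's over-thrust is clamped
```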
  11. Wicpar

    Sensors

    Subspace magic, of course ^^. But you can do without it by detecting tachyons, a theoretical superluminal particle. If we force warp drives to generate those it would be quite practical, but we should think of a twist, something you can do to hide it or play with it. The best solution is the simple one: a tachyon generator that can emulate a tachyon source in its vicinity, to fake big attacks or scare people, etc. Cloaks should also emit tachyons, so you can do really interesting psychological warfare.
  12. I really have to disagree. In your analysis you assume the skills increase linearly, which is bound to result in that behavior. But with a logarithmic progression there is less and less reward for grinding, making it useless after some time. Think of it as a max level without a limit; that is the property of the logarithmic curve. Levels wouldn't unlock new stuff, they would increase the stats of the things you produce. That said, a veteran will always have an advantage over a newb, but the gap gets smaller and smaller with time. Just type f(x) = log(x) - log(x - 10) into Google to see the gap curve (or run the sketch below). A vet will always be able to make the best items, but the shrinking gap will decrease the demand for his products, regulating the market so the profit advantage stays balanced.
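The gap curve from the post, evaluated directly: a veteran with a 10-unit head start versus a newcomer, both on skill(x) = log(x). The gap shrinks toward zero as both keep playing.

```python
import math

for x in [11, 20, 50, 100, 1000]:
    gap = math.log(x) - math.log(x - 10)   # veteran minus newcomer skill
    print(x, round(gap, 3))                # 2.398, 0.693, 0.223, 0.105, 0.01
```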
  13. I really hope NQ decides to make resources come in big bunches that are hard to find. This makes it worthwhile to install a mining post, but also decreases the need for resource regeneration, as you would need a certain period of time to fully mine a vein. If there is one right way to handle mining, I have never seen a better system than the one Terrafirmacraft created.
  14. I concur, but the increase in stats has to be logarithmic so as not to create too many one-shot weapons or gear that is invincible against low levels. Logarithms are the most natural progression curve in these matters (a weird scientific fact: many stats show logarithmic tendencies IRL).
  15. There is an alternative that deepens the system quite a lot: you train things passively as normal, but with a twist: every time you perform an action in a certain domain, you earn a tiny amount of skill points in it, allowing you to level up as you use your skills. That is way more natural than selecting abstract skills, because you may not be certain which skill points you will need. The passive generation should remain, as you may eventually want to change activity and thus be able to skip the lower steps. Skills should increase logarithmically so that newer players face a smaller skill gap after some time (a toy model follows).
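A toy model of that hybrid scheme (the numbers are mine, not NQ's): a passive trickle in every domain plus a small award per action, with levels computed logarithmically so gaps shrink over time.

```python
import math

points = {"mining": 0.0, "piloting": 0.0}

def level(p):
    return math.log1p(p)            # logarithmic levels, no hard cap

def tick(hours):                    # passive trickle in every domain
    for skill in points:
        points[skill] += 0.1 * hours

def act(skill, award=1.0):          # active use earns extra points
    points[skill] += award

tick(10)                            # ten hours of passive training
for _ in range(50):
    act("mining")                   # plus fifty mining actions
print({s: round(level(p), 2) for s, p in points.items()})
```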
  16. We are borg. We are perfect. We are efficient. Zombies are irrelevant.
  17. Yes, that is true, but if my theory that gravity is caused by the bending of space-time holds, it can generate an acceleration toward the area where there is more space, as matter has a tendency to expand into where there is more room, like a gas, thus creating some amount of acceleration. But it is speculation.
  18. Eve has its targeting system at the core of its mechanics, and it was made 13 years ago. Network latency can be mitigated by taking players' ping into account. If and only if the input is sent to the server, which makes the calculations and sends the results back for rendering, you can create a 1 s forced latency for every input: it creates a queue and compensates relative to the ping at the moment of reception. That system could easily be exploited by hackers artificially increasing and decreasing their ping, thus falsifying the latency. The solution is to send a random unique hash with every update tick and use it as a key when sending the input back; nothing too fancy, just a way to make input invalid if cheating is attempted (a sketch of this token scheme closes this post).

If warp engines work like I think, you may only be able to jump to within a certain range of other vessels (20-30 km) and thus avoid being shot at from 300-400 km away. Additionally, there would still be imprecision in the shots, like in every gun (think of each shot as a Monte Carlo sampled ray), but extremely low; it would need double precision for a normalized vector, or scaled-up vectors instead. You would have full control over the shot properties: speed, size, force, etc., like in From The Depths (which uses the CPU for physics, so it is quite slow with many shots). So whether your average shot goes 10 km/s or 100 km/s, depending on your precision and the enemy's maneuverability, you would always have a way to hit.

Physically based targeting adds such depth to the game that it could create three categories of jobs (gun scripter, gun balancer and gun space optimizer) and requirements for different types of guns on bigger ships (big ones that are slow but powerful, weak ones that are extremely fast), all of which can be fine-tuned. The best way to put it is the difference between a linear function (target and dice roll) and a Gaussian curve matrix (physical targeting). As I have argued, the performance and the latency are irrelevant for NQ compared to the depth added by that feature. I understand there is a high dev cost involved, but this is still the start, and as a dev I can say it is not out of reach of current technology, or of affordability, even in the worst scenarios. NQ relies on the high praise of its revolutionary tech, and this feature would be truly revolutionary at this scale; in itself it would be enough to praise Dual.

FFS, emulation is like American cheese: bland. Why? Because the server says you hit his toe when your sword clearly cut his head off. The problem is that it does not reflect reality, and that is frustrating. You get used to it, but why endure it when you can be in control? But then, why have voxels when you can have beautifully detailed static meshes, right? NQ has made it clear this is not their state of mind or global direction.

Most bullet-based systems do simulate EVERY shot, but they usually bake them: they cast a ray at fire time and then move the projectile along it. That allows for quite an optimization while the shot is still simulated (see the sketch below). Some even do instant ray hits, like CS:GO, and then draw the tracer, but you can see how people complain about the glitches this causes.

As for rendering, it is the cheapest part. Today you can render tens of millions of particles with their physics (no collisions) on your average GPU. How would you render those shots? Well, you can send the GPU an array of dots with a vector each and tell it to render a mesh on every one, and that is nearly as fast as rendering standard particles.
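A sketch of that "baked ray" trick (my illustration, not any particular engine's code): solve the hit once at fire time, then the projectile is just a point moving along a precomputed ray, so the per-frame cost is one multiply-add.

```python
import math

def fire(origin, direction, speed, spheres):
    # One ray cast at spawn decides the hit; targets are (x, y, z, r).
    hit_t = None
    for cx, cy, cz, r in spheres:
        ox, oy, oz = cx - origin[0], cy - origin[1], cz - origin[2]
        proj = ox * direction[0] + oy * direction[1] + oz * direction[2]
        d2 = ox * ox + oy * oy + oz * oz - proj * proj
        if proj > 0 and d2 < r * r:
            t = (proj - math.sqrt(r * r - d2)) / speed
            hit_t = t if hit_t is None else min(hit_t, t)
    return {"origin": origin, "dir": direction, "speed": speed, "hit_t": hit_t}

def position(shot, t):
    # Baked motion: no per-frame physics, just travel along the ray.
    return tuple(o + d * shot["speed"] * t
                 for o, d in zip(shot["origin"], shot["dir"]))

shot = fire((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 100.0, [(500.0, 0.0, 0.0, 5.0)])
print(shot["hit_t"], position(shot, 1.0))   # hits at t = 4.95 s
```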
The hard part today is the high latency and limited bandwidth between the CPU and the GPU, which makes data transfer the most expensive step; you mostly have to worry about sending as little data as rarely as possible. And particles are cheap as hell (per particle), because a GPU is designed to do the same thing many times in a row. Honestly, rendering voxels would be the most expensive part of this game, as the mesh often has to be rebuilt from the octree, but that can be done asynchronously and thus isn't too relevant.

Most of your arguments here are based on fear and ignorance: "let's stay safe and do as we always did", and that is definitely not the way to go. NQ is here to innovate, not to be limited to existing working systems. If they want to succeed they have to put their ideals before the product, as we users instinctively reason with emotions, and are thus more attached to the ideals than to the product. This seems contradictory, but it isn't: you have to follow your ideals and rules if you want to be taken seriously, which means applying your ideals in your product. For instance, that is how Apple got so popular: "think different" is not a product but an ideal; they happened to have a product that conformed to it, and it sold like hell. But now that Jobs is dead, no one has maintained that, and see how badly they are doing, how their new products become old and boring rapidly, or are even stupid from the start. I may not be able to express it fully, but that example shows how important this is at its core.
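And the token scheme from the first paragraph, sketched out (my reading of it, not NQ's actual protocol): each tick the server issues a random nonce, and inputs must come back authenticated with the nonce of the tick they claim, so a cheater cannot shift inputs between ticks by faking ping.

```python
import hashlib, hmac, os

nonces = {}                                    # server side: tick -> nonce

def start_tick(tick_id):
    nonces[tick_id] = os.urandom(16)           # random unique value per tick
    return nonces[tick_id]                     # broadcast with the update

def sign_input(nonce, payload):                # client side, on every input
    return hmac.new(nonce, payload, hashlib.sha256).digest()

def accept(tick_id, current_tick, payload, tag, max_age=10):
    nonce = nonces.get(tick_id)
    if nonce is None or current_tick - tick_id > max_age:
        return False                           # unknown or stale tick: drop
    return hmac.compare_digest(tag, sign_input(nonce, payload))

n = start_tick(100)
tag = sign_input(n, b"fire azimuth=0.42")
print(accept(100, 101, b"fire azimuth=0.42", tag))   # True: valid and fresh
print(accept(100, 120, b"fire azimuth=0.42", tag))   # False: too old
```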
  19. Wicpar

    Sensors

    That idea is great, but please note that sensors do not detect by magic: they use EM waves, and those can be absorbed and/or reflected and/or detected. On the detection side, purely passive scanners (basically cameras) can be used to avoid being detected yourself, but they give a much lower resolution or arc of detection. You can detect active scanners with both active and passive scanners, because they light up in the direction they are scanning.
  20. If they are anywhere near making a 2000-player battle, they surely can have an array of GPUs: 13 * 2000 = 26000 = ~20 Nvidia Titan X, which gives a total of 20 * 3584 CUDA cores that can calculate the physics of a whopping 71680 bullets simultaneously. At 2000 instructions per shot physics tick (a very generous estimate), that is 1417000 (base clock, so probably even higher) / 2000 * 71680 = 50785280 theoretical bullet physics per second, and half that for a 1000-player battle. (Memory is negligible, as only the relevant physics voxels will be loaded in VRAM, which would hardly approach 12 GB.) You, my sir, have ludicrous claims. Have a nice being destroyed (no offense, just for the drama). EDIT: I was mistaken (yes, self-destruction): one GPU cycle does not coincide with one instruction, so with the roughly 6-deep instruction pipeline it will more likely be in the realm of 10 million theoretical bullet physics per second for 20 Titan X. Note though that this only describes latency; that worst case can only happen if the program is really badly written and compiled, so it should still be around 50M. (The arithmetic is reproduced below.)
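Reproducing the back-of-envelope arithmetic (the assumptions are the post's own: 3584 cores per Titan X, the 1417000 base-clock figure as given, 2000 instructions per shot tick; this is an estimate, not a benchmark):

```python
gpus = 20
cores = gpus * 3584                    # 71680 bullets in flight at once
clock = 1417000                        # base-clock figure as used above
instr_per_shot_tick = 2000             # very generous estimate

shots_per_second = clock / instr_per_shot_tick * cores
print(cores, int(shots_per_second))    # 71680, 50785280

# The EDIT's worst case: ~6 cycles of pipeline latency per instruction.
print(int(shots_per_second / 6))       # ~8.5M, "the realm of 10 million"
```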
  21. Space may be bent, but depending on the geometry it either generates acceleration or multiplies speed.
  22. Such planets are possible, since voxels are not necessarily only one meter wide; they are likely organized in an octree, which spares a lot of memory. Additionally, if the procedural terrain generation is deterministic, it can be used to fill in unmodified voxels, so only the modified ones need saving, reducing the memory usage to a few megabytes even for massive planets. (Every time someone approaches, the planet would have to be baked again, but that is nothing too expensive.) A sketch of the idea follows.
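A sketch of "store only the edits" (my illustration of the idea, not NQ's storage format): unmodified voxels come from a deterministic generator, so the save data is just a sparse map of player changes.

```python
import hashlib

def generated_voxel(x, y, z, seed=1337):
    # Deterministic stand-in for real terrain noise: same inputs give
    # the same voxel every time, on every machine.
    h = hashlib.sha256(f"{seed}:{x}:{y}:{z}".encode()).digest()
    return "rock" if h[0] < 128 else "air"

edits = {}                                   # sparse: only modified voxels

def set_voxel(x, y, z, material):
    edits[(x, y, z)] = material              # this map is all that is saved

def voxel(x, y, z):
    return edits.get((x, y, z), generated_voxel(x, y, z))

set_voxel(10, 4, 2, "air")                   # dig a hole
print(voxel(10, 4, 2), voxel(0, 0, 0), len(edits))   # 'air', terrain, 1
```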
  23. That is not true: the warp drive works by removing space in front of you and putting it behind you, which allows for a great absolute acceleration without G-forces (relatively you still accelerate slowly, but the compressed space multiplies the effect). The only drawback is that you can only go in a straight line, as it does not reduce rotation-based G-forces.
  24. To simulate orbital physics correctly, you would need not only frames of reference but separate voxel octrees: one per structure. The system is simple: the universe has a plane of reference containing the solar systems' planes of reference, and those are static. Each solar system stores the combined center of mass of all bodies in the system and builds the orbital ellipses around it. The center of mass moves with the planets and thus approximates the N-body system in some ways (the orbits have to be slightly adjusted every update, but that is quite cheap, as it changes the body's position without changing its energy). The same goes, scaling down, for planets and their rings/moons/asteroids. As for frames of reference in planetary atmospheres, those are relatively static (unless there is a superstructure, but that will not happen in the next 2 years) and can thus live directly in a child/parent reference relationship (sketched below).
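A sketch of that nested hierarchy (my illustration of the scheme described above): each frame stores its state relative to its parent, and absolute coordinates are composed on demand.

```python
class Frame:
    def __init__(self, name, parent=None, offset=(0.0, 0.0, 0.0)):
        self.name, self.parent, self.offset = name, parent, offset

    def world_position(self):
        # Compose offsets up the chain; rotation omitted for brevity.
        if self.parent is None:
            return self.offset
        px, py, pz = self.parent.world_position()
        ox, oy, oz = self.offset
        return (px + ox, py + oy, pz + oz)

universe = Frame("universe")
system = Frame("solar_system", universe, (1e9, 0.0, 0.0))   # static plane
planet = Frame("planet", system, (1e6, 0.0, 0.0))     # orbits the barycenter
ship = Frame("ship", planet, (500.0, 0.0, 0.0))       # one octree per structure
print(ship.world_position())    # composed through the whole hierarchy
```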